Nov 25 15:04:13 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 25 15:04:13 crc restorecon[4652]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:04:13 crc restorecon[4652]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc 
restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc 
restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 
15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc 
restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc 
restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:13
crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:13 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 
15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 
crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc 
restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:04:14 crc restorecon[4652]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:04:14 crc restorecon[4652]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Nov 25 15:04:16 crc kubenswrapper[4965]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 25 15:04:16 crc kubenswrapper[4965]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Nov 25 15:04:16 crc kubenswrapper[4965]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 25 15:04:16 crc kubenswrapper[4965]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 25 15:04:16 crc kubenswrapper[4965]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Nov 25 15:04:16 crc kubenswrapper[4965]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.250397 4965 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271386 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271418 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271427 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271436 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271445 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271455 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271463 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271471 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271479 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271487 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271495 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271503 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271513 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271524 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271534 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271543 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271552 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271560 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271568 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271576 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271597 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271605 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271613 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271621 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271629 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271637 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271646 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271655 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271663 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271670 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271678 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271686 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271694 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271702 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271710 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271720 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271729 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271739 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271747 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271757 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271766 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271775 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271824 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271833 4965 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271842 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271849 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271857 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271867 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271874 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271883 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271890 4965 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271898 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271906 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271913 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271921 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271931 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271941 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271949 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271957 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271988 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.271996 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272004 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272011 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272019 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272027 4965 feature_gate.go:330] unrecognized feature
gate: ManagedBootImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272034 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272042 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272050 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272058 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272065 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.272073 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272222 4965 flags.go:64] FLAG: --address="0.0.0.0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272239 4965 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272260 4965 flags.go:64] FLAG: --anonymous-auth="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272289 4965 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272305 4965 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272318 4965 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272334 4965 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272356 4965 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272366 4965 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272375 4965 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272385 4965 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272395 4965 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272404 4965 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272413 4965 flags.go:64] FLAG: --cgroup-root="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272422 4965 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272431 4965 flags.go:64] FLAG: --client-ca-file="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272440 4965 flags.go:64] FLAG: --cloud-config="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272449 4965 flags.go:64] FLAG: --cloud-provider="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272458 4965 flags.go:64] FLAG: --cluster-dns="[]" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272471 4965 flags.go:64] FLAG: --cluster-domain="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272480 4965 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272489 4965 flags.go:64] FLAG: --config-dir="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272498 4965 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272508 4965 flags.go:64] FLAG: --container-log-max-files="5" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272519 4965 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272527 4965 flags.go:64] 
FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272537 4965 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272546 4965 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272556 4965 flags.go:64] FLAG: --contention-profiling="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272564 4965 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272573 4965 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272582 4965 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272591 4965 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272603 4965 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272613 4965 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272622 4965 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272630 4965 flags.go:64] FLAG: --enable-load-reader="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272639 4965 flags.go:64] FLAG: --enable-server="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272650 4965 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272661 4965 flags.go:64] FLAG: --event-burst="100" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272670 4965 flags.go:64] FLAG: --event-qps="50" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272679 4965 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 
15:04:16.272688 4965 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272698 4965 flags.go:64] FLAG: --eviction-hard="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272709 4965 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272718 4965 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272728 4965 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272737 4965 flags.go:64] FLAG: --eviction-soft="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272748 4965 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272757 4965 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272766 4965 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272775 4965 flags.go:64] FLAG: --experimental-mounter-path="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272784 4965 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272792 4965 flags.go:64] FLAG: --fail-swap-on="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272801 4965 flags.go:64] FLAG: --feature-gates="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272812 4965 flags.go:64] FLAG: --file-check-frequency="20s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272821 4965 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272830 4965 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272840 4965 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 25 15:04:16 crc 
kubenswrapper[4965]: I1125 15:04:16.272850 4965 flags.go:64] FLAG: --healthz-port="10248" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272859 4965 flags.go:64] FLAG: --help="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272868 4965 flags.go:64] FLAG: --hostname-override="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272876 4965 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272885 4965 flags.go:64] FLAG: --http-check-frequency="20s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272895 4965 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272904 4965 flags.go:64] FLAG: --image-credential-provider-config="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272913 4965 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272921 4965 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272930 4965 flags.go:64] FLAG: --image-service-endpoint="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272939 4965 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272949 4965 flags.go:64] FLAG: --kube-api-burst="100" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.272958 4965 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273001 4965 flags.go:64] FLAG: --kube-api-qps="50" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273011 4965 flags.go:64] FLAG: --kube-reserved="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273020 4965 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273028 4965 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 25 15:04:16 crc 
kubenswrapper[4965]: I1125 15:04:16.273038 4965 flags.go:64] FLAG: --kubelet-cgroups="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273047 4965 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273203 4965 flags.go:64] FLAG: --lock-file="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273213 4965 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273223 4965 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273232 4965 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273245 4965 flags.go:64] FLAG: --log-json-split-stream="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273254 4965 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273263 4965 flags.go:64] FLAG: --log-text-split-stream="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273273 4965 flags.go:64] FLAG: --logging-format="text" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273281 4965 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273291 4965 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273300 4965 flags.go:64] FLAG: --manifest-url="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273309 4965 flags.go:64] FLAG: --manifest-url-header="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273321 4965 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273331 4965 flags.go:64] FLAG: --max-open-files="1000000" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273342 4965 flags.go:64] FLAG: --max-pods="110" Nov 25 15:04:16 crc 
kubenswrapper[4965]: I1125 15:04:16.273351 4965 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273360 4965 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273369 4965 flags.go:64] FLAG: --memory-manager-policy="None" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273378 4965 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273388 4965 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273396 4965 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273406 4965 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273426 4965 flags.go:64] FLAG: --node-status-max-images="50" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273436 4965 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273445 4965 flags.go:64] FLAG: --oom-score-adj="-999" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273454 4965 flags.go:64] FLAG: --pod-cidr="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273463 4965 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273477 4965 flags.go:64] FLAG: --pod-manifest-path="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273486 4965 flags.go:64] FLAG: --pod-max-pids="-1" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273496 4965 flags.go:64] FLAG: --pods-per-core="0" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273504 4965 
flags.go:64] FLAG: --port="10250" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273514 4965 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273522 4965 flags.go:64] FLAG: --provider-id="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273532 4965 flags.go:64] FLAG: --qos-reserved="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273541 4965 flags.go:64] FLAG: --read-only-port="10255" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273550 4965 flags.go:64] FLAG: --register-node="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273559 4965 flags.go:64] FLAG: --register-schedulable="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273568 4965 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273583 4965 flags.go:64] FLAG: --registry-burst="10" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273592 4965 flags.go:64] FLAG: --registry-qps="5" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273600 4965 flags.go:64] FLAG: --reserved-cpus="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273609 4965 flags.go:64] FLAG: --reserved-memory="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273620 4965 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273629 4965 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273638 4965 flags.go:64] FLAG: --rotate-certificates="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273647 4965 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273656 4965 flags.go:64] FLAG: --runonce="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273665 4965 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273675 4965 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273684 4965 flags.go:64] FLAG: --seccomp-default="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273693 4965 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273701 4965 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273711 4965 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273720 4965 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273729 4965 flags.go:64] FLAG: --storage-driver-password="root" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273738 4965 flags.go:64] FLAG: --storage-driver-secure="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273748 4965 flags.go:64] FLAG: --storage-driver-table="stats" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273757 4965 flags.go:64] FLAG: --storage-driver-user="root" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273766 4965 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273775 4965 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273785 4965 flags.go:64] FLAG: --system-cgroups="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273793 4965 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273807 4965 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273816 4965 flags.go:64] FLAG: --tls-cert-file="" Nov 25 15:04:16 crc 
kubenswrapper[4965]: I1125 15:04:16.273825 4965 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273838 4965 flags.go:64] FLAG: --tls-min-version="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273856 4965 flags.go:64] FLAG: --tls-private-key-file="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273864 4965 flags.go:64] FLAG: --topology-manager-policy="none" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273874 4965 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273883 4965 flags.go:64] FLAG: --topology-manager-scope="container" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273892 4965 flags.go:64] FLAG: --v="2" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273903 4965 flags.go:64] FLAG: --version="false" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273914 4965 flags.go:64] FLAG: --vmodule="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273929 4965 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.273939 4965 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274165 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274176 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274186 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274194 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274204 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 15:04:16 crc 
kubenswrapper[4965]: W1125 15:04:16.274214 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274222 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274230 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274238 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274245 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274253 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274261 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274273 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274281 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274288 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274297 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274304 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274312 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274320 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274328 4965 feature_gate.go:330] unrecognized feature gate: 
PlatformOperators Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274336 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274344 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274351 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274362 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274370 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274377 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274385 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274393 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274401 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274408 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274419 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274429 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274438 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274447 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274457 4965 feature_gate.go:330] unrecognized feature gate: Example Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274466 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274476 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274484 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274493 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274501 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274510 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274518 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274526 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274533 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274545 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274555 4965 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274567 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274576 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274585 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274593 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274602 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274612 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274621 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274630 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274638 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274678 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274688 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274697 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274707 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 
15:04:16.274715 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274724 4965 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274732 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274741 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274748 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274759 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274769 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274779 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274788 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274797 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274805 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.274815 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.274837 4965 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.284847 4965 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.284884 4965 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284946 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284954 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284958 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284983 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284988 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284992 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.284996 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285000 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285003 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285007 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285010 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 15:04:16 crc 
kubenswrapper[4965]: W1125 15:04:16.285015 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285019 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285023 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285026 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285030 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285033 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285037 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285040 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285044 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285047 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285051 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285055 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285060 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285064 4965 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285068 4965 feature_gate.go:330] unrecognized feature gate: 
SignatureStores Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285072 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285077 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285083 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285089 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285094 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285101 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285107 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285111 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285116 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285120 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285125 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285128 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285132 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285136 4965 feature_gate.go:330] unrecognized 
feature gate: MixedCPUsAllocation Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285141 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285145 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285148 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285152 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285157 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285161 4965 feature_gate.go:330] unrecognized feature gate: Example Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285165 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285169 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285173 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285177 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285180 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285184 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285188 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285193 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 15:04:16 crc 
kubenswrapper[4965]: W1125 15:04:16.285197 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285202 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285206 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285211 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285214 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285219 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285225 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285232 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285237 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285241 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285245 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285250 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285254 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285259 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285265 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285269 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285274 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.285281 4965 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285409 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285417 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285422 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285425 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285429 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285434 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285440 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285444 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285448 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285454 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285458 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285461 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285466 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285471 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285475 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285478 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285482 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285486 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285490 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285494 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285498 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285501 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285505 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285509 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285512 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285516 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285519 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 15:04:16 crc 
kubenswrapper[4965]: W1125 15:04:16.285523 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285527 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285530 4965 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285534 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285538 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285542 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285546 4965 feature_gate.go:330] unrecognized feature gate: Example Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285551 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285554 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285558 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285561 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285565 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285568 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285573 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285577 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285580 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285584 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285588 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285591 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285595 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285598 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285602 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285605 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285609 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285612 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285616 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285619 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285623 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285626 4965 feature_gate.go:330] unrecognized 
feature gate: ImageStreamImportMode Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285630 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285633 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285637 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285640 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285644 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285647 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285651 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285654 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285657 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285661 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285665 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285668 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285672 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.285675 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 15:04:16 crc 
kubenswrapper[4965]: W1125 15:04:16.285679 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.285685 4965 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.285861 4965 server.go:940] "Client rotation is on, will bootstrap in background" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.291592 4965 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.291687 4965 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.302424 4965 server.go:997] "Starting client certificate rotation" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.302454 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.302644 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-14 21:31:06.3964751 +0000 UTC Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.302728 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1206h26m50.09375037s for next certificate rotation Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.363834 4965 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.379596 4965 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.446019 4965 log.go:25] "Validated CRI v1 runtime API" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.554369 4965 log.go:25] "Validated CRI v1 image API" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.560236 4965 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.599685 4965 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-25-14-58-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.599941 4965 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.620759 4965 manager.go:217] Machine: {Timestamp:2025-11-25 15:04:16.611704311 +0000 UTC m=+1.579298097 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:69eb65a6-67c0-4926-88da-f3ca03c4aea4 BootID:10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:cd:b3:12 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:cd:b3:12 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:dd:1e:ea Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a5:f3:6b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6e:8a:d9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:db:5d:dc Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:44:cc:38 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:07:0c:b4:53:2c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:16:52:80:04:c0:19 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 
BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.621280 4965 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.621501 4965 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.621902 4965 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.622238 4965 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.622358 4965 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.622659 4965 topology_manager.go:138] "Creating topology manager with none policy" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.622747 4965 container_manager_linux.go:303] "Creating device plugin manager" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.623385 4965 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.623505 4965 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.623811 4965 state_mem.go:36] "Initialized new in-memory state store" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.623993 4965 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.642723 4965 kubelet.go:418] "Attempting to sync node with API server" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.642792 4965 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.642940 4965 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.643019 4965 kubelet.go:324] "Adding apiserver pod source" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.643113 4965 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.649582 4965 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.651490 4965 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.653683 4965 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.655884 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656009 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656036 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656058 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656088 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656108 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.656064 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.656206 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656136 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656268 4965 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656288 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656303 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656322 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.656338 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.656311 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.656390 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.658541 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.659399 4965 server.go:1280] "Started kubelet" Nov 25 15:04:16 crc systemd[1]: Started Kubernetes Kubelet. 
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.665344 4965 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.665280 4965 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.665905 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.666895 4965 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.667802 4965 server.go:460] "Adding debug handlers to kubelet server" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.698251 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.698601 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:41:37.717303886 +0000 UTC Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.702053 4965 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.704737 4965 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.704877 4965 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.705133 4965 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.706704 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.706819 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.707061 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.709668 4965 factory.go:55] Registering systemd factory Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.709713 4965 factory.go:221] Registration of the systemd container factory successfully Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.713533 4965 factory.go:153] Registering CRI-O factory Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.713569 4965 factory.go:221] Registration of the crio container factory successfully Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.713664 4965 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.713719 4965 factory.go:103] Registering Raw factory Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.713741 4965 manager.go:1196] Started watching for new ooms in manager Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.714008 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="200ms" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.713628 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b48318a7751ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 15:04:16.659354027 +0000 UTC m=+1.626947843,LastTimestamp:2025-11-25 15:04:16.659354027 +0000 UTC m=+1.626947843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.718070 4965 manager.go:319] Starting recovery of all containers Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721555 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721602 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721650 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721662 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721675 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721687 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721717 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721731 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721750 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721765 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721805 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721826 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.721839 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724369 4965 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724426 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724442 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724453 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724464 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724474 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724487 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724508 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724523 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724536 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724548 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724560 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724571 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724581 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724592 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724602 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724640 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724651 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724659 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724669 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724678 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724688 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724699 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724709 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724718 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724727 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" 
seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724740 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724749 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724758 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724766 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724780 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724794 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 
15:04:16.724808 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724821 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724831 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724841 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724850 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724861 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724871 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.724995 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725016 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725029 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725042 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725052 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725062 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725070 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725080 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725091 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725106 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725118 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725132 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725143 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725152 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725161 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725170 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725179 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725191 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725202 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725210 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725263 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725273 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725285 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725294 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725303 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725314 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725324 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725333 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725342 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725351 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725359 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725368 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725378 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725387 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725396 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725406 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725415 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725427 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725437 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725453 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725463 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725473 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725483 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725493 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725505 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725519 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725532 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725553 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725562 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725573 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725586 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725598 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725611 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725631 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725645 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725659 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725672 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725685 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725699 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725708 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725717 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725730 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725743 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725756 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725769 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725779 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725790 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725801 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725810 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725818 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725827 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725837 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725846 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725856 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725867 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725878 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725889 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725902 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725914 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725927 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725938 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725953 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.725998 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726009 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726021 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726039 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726049 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726060 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726068 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726078 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726090 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726102 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726114 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726129 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726142 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726154 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726167 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726178 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726190 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726203 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726217 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726229 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726244 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726256 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726267 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726279 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726292 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726303 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726314 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726326 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726337 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726348 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726359 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726374 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726389 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726402 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726414 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726425 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726437 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726449 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726462 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726475 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726486 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726497 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726508 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726521 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726534 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726546 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726558 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726570 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726584 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726596 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726608 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726620 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726633 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726645 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726656 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726670 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726686 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726697 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726710 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783"
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726720 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726730 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726743 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726755 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726767 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726780 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726794 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726807 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726819 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726830 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726842 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726853 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726865 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726876 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726886 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726897 4965 reconstruct.go:97] "Volume reconstruction finished" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.726905 4965 reconciler.go:26] "Reconciler: start to sync state" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.738624 4965 manager.go:324] Recovery completed Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.754560 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.756299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.756326 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.756334 4965 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.757283 4965 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.757303 4965 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.757320 4965 state_mem.go:36] "Initialized new in-memory state store" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.768435 4965 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.770148 4965 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.770240 4965 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.770281 4965 kubelet.go:2335] "Starting kubelet main sync loop" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.770437 4965 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 25 15:04:16 crc kubenswrapper[4965]: W1125 15:04:16.770824 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.770934 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.807960 4965 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.871056 4965 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.908584 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:04:16 crc kubenswrapper[4965]: E1125 15:04:16.915337 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="400ms" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.957715 4965 policy_none.go:49] "None policy: Start" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.959477 4965 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 25 15:04:16 crc kubenswrapper[4965]: I1125 15:04:16.959503 4965 state_mem.go:35] "Initializing new in-memory state store" Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.009425 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.069076 4965 manager.go:334] "Starting Device Plugin manager" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.069280 4965 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.069297 4965 server.go:79] "Starting device plugin registration server" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.069652 4965 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.069662 4965 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.070088 4965 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.070155 4965 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.070161 4965 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.071245 4965 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.071317 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.072191 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.072264 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.072277 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.072436 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.073081 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.073124 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.074208 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.074254 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.074265 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.074358 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.074939 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.074980 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.075520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.075547 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.075557 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.076218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.076236 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.076246 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.076417 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.076881 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.076902 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.077360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.077385 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.077419 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.080124 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.080151 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.080162 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.080356 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.080788 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.080820 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085445 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085488 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085528 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085532 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085511 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085677 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085751 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.085793 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.086557 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.086575 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.086586 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.089960 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.131452 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.131626 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.131700 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.131770 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.131836 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.131906 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132002 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132081 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132155 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132220 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132288 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132351 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132469 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132561 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.132612 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.170376 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.172590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.172618 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.172629 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.172654 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.173128 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234163 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234215 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234239 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234260 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234282 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234301 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234321 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234339 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234363 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234371 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234405 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 
15:04:17.234422 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234418 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234463 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234480 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234384 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234498 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234454 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234513 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234517 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234549 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234582 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234610 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234639 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234670 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234676 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234723 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234759 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234638 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.234815 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.316842 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="800ms" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.373896 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.376003 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.376054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.376066 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.376141 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.376615 4965 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.414216 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.423515 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.444675 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.465798 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.472914 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.490804 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e30ca031bcef77145cc013dd996e9ff53b6cbb16bafc24916d16f6d0a07b6982 WatchSource:0}: Error finding container e30ca031bcef77145cc013dd996e9ff53b6cbb16bafc24916d16f6d0a07b6982: Status 404 returned error can't find the container with id e30ca031bcef77145cc013dd996e9ff53b6cbb16bafc24916d16f6d0a07b6982 Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.496606 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-55f6457348a90f60ad25a0b7692444b8f8c5776642f903af740eb96ea178599c WatchSource:0}: Error finding container 55f6457348a90f60ad25a0b7692444b8f8c5776642f903af740eb96ea178599c: Status 404 returned error can't find the container with id 55f6457348a90f60ad25a0b7692444b8f8c5776642f903af740eb96ea178599c Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.501847 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5c505f11ebe5ac733a61dea171f5fbc4852de9a46d2c8c0ac457423508f724f6 WatchSource:0}: Error finding container 5c505f11ebe5ac733a61dea171f5fbc4852de9a46d2c8c0ac457423508f724f6: Status 404 returned error can't find the container with id 5c505f11ebe5ac733a61dea171f5fbc4852de9a46d2c8c0ac457423508f724f6 Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.509305 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ecca2b87eba530d342567a8ef33e853e1aee2b06ca6197de563099ac63112249 WatchSource:0}: Error finding 
container ecca2b87eba530d342567a8ef33e853e1aee2b06ca6197de563099ac63112249: Status 404 returned error can't find the container with id ecca2b87eba530d342567a8ef33e853e1aee2b06ca6197de563099ac63112249 Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.511069 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b7cee554d85e147c8e893e3e8298d9e87e327ad5663e70918daadf8e81e66e1e WatchSource:0}: Error finding container b7cee554d85e147c8e893e3e8298d9e87e327ad5663e70918daadf8e81e66e1e: Status 404 returned error can't find the container with id b7cee554d85e147c8e893e3e8298d9e87e327ad5663e70918daadf8e81e66e1e Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.530407 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.530592 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.667727 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.698758 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 21:04:49.580156453 +0000 
UTC Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.698849 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 294h0m31.881310342s for next certificate rotation Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.775038 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55f6457348a90f60ad25a0b7692444b8f8c5776642f903af740eb96ea178599c"} Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.776433 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e30ca031bcef77145cc013dd996e9ff53b6cbb16bafc24916d16f6d0a07b6982"} Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.776748 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.777430 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7cee554d85e147c8e893e3e8298d9e87e327ad5663e70918daadf8e81e66e1e"} Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.777907 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.777956 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.777990 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.778016 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:04:17 
crc kubenswrapper[4965]: I1125 15:04:17.778389 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ecca2b87eba530d342567a8ef33e853e1aee2b06ca6197de563099ac63112249"} Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.778615 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 25 15:04:17 crc kubenswrapper[4965]: I1125 15:04:17.779347 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5c505f11ebe5ac733a61dea171f5fbc4852de9a46d2c8c0ac457423508f724f6"} Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.980538 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:17 crc kubenswrapper[4965]: E1125 15:04:17.980663 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:17 crc kubenswrapper[4965]: W1125 15:04:17.985484 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:17 crc kubenswrapper[4965]: 
E1125 15:04:17.985559 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:18 crc kubenswrapper[4965]: E1125 15:04:18.117712 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="1.6s" Nov 25 15:04:18 crc kubenswrapper[4965]: W1125 15:04:18.239186 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:18 crc kubenswrapper[4965]: E1125 15:04:18.239469 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.579221 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.581514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.581554 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.581565 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.581589 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:04:18 crc kubenswrapper[4965]: E1125 15:04:18.582057 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.667142 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.784259 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8" exitCode=0 Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.784393 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8"} Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.785007 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.787623 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.787668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.787681 4965 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.789450 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.789462 4965 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11" exitCode=0 Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.789527 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11"} Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.789685 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791144 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791187 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791200 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791676 4965 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165" exitCode=0 Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791755 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165"} Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791860 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791908 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791926 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.791936 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.792683 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.792713 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.792745 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.797102 4965 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4" exitCode=0 Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.797176 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4"} Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.797260 4965 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.798495 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.798529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.798541 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.800425 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f"} Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.800475 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14"} Nov 25 15:04:18 crc kubenswrapper[4965]: I1125 15:04:18.800490 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53"} Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.667284 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 25 15:04:19 crc kubenswrapper[4965]: E1125 15:04:19.719089 4965 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="3.2s" Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.804300 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a"} Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.804382 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.805279 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.805301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.805310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807016 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66"} Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807039 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19"} Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807049 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807062 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807071 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807096 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807929 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807951 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.807959 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.808951 4965 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf" exitCode=0
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.809008 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.809097 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.809889 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.809939 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.809950 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.810287 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.810298 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.811685 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.811706 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.811714 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.811946 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.812036 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.812101 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521"}
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.812200 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.812834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.812851 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:19 crc kubenswrapper[4965]: I1125 15:04:19.812858 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:19 crc kubenswrapper[4965]: W1125 15:04:19.911561 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 25 15:04:19 crc kubenswrapper[4965]: E1125 15:04:19.911625 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.126873 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.182199 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.183498 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.183520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.183530 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.183565 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 25 15:04:20 crc kubenswrapper[4965]: E1125 15:04:20.184004 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Nov 25 15:04:20 crc kubenswrapper[4965]: W1125 15:04:20.368481 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 25 15:04:20 crc kubenswrapper[4965]: E1125 15:04:20.368593 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:04:20 crc kubenswrapper[4965]: W1125 15:04:20.495052 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 25 15:04:20 crc kubenswrapper[4965]: E1125 15:04:20.495461 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.601568 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.817322 4965 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49" exitCode=0
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.817512 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.818088 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.818409 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49"}
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.818489 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.818829 4965 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.818857 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.819236 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.820269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.820297 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.820308 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.820882 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.820903 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.820914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.821465 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.821488 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.821498 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.822020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.822041 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.822053 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.822497 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.822520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:20 crc kubenswrapper[4965]: I1125 15:04:20.822529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.823813 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc"}
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.823846 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.823861 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20"}
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.823881 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839"}
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.824848 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.824902 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:21 crc kubenswrapper[4965]: I1125 15:04:21.824914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:22 crc kubenswrapper[4965]: I1125 15:04:22.835928 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154"}
Nov 25 15:04:22 crc kubenswrapper[4965]: I1125 15:04:22.836084 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1"}
Nov 25 15:04:22 crc kubenswrapper[4965]: I1125 15:04:22.836295 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:22 crc kubenswrapper[4965]: I1125 15:04:22.837238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:22 crc kubenswrapper[4965]: I1125 15:04:22.837275 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:22 crc kubenswrapper[4965]: I1125 15:04:22.837287 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.385070 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.386938 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.387010 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.387026 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.387061 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.396410 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.396825 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.398352 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.398395 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.398409 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.429580 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.429813 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.431147 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.431233 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.431254 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.838368 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.839482 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.839517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:23 crc kubenswrapper[4965]: I1125 15:04:23.839527 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.117399 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.117676 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.119767 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.119831 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.119845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.125669 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.173443 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.301099 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.840734 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.840812 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.840921 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.842127 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.842129 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.842194 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.842225 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.842248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:24 crc kubenswrapper[4965]: I1125 15:04:24.842228 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:25 crc kubenswrapper[4965]: I1125 15:04:25.842670 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:25 crc kubenswrapper[4965]: I1125 15:04:25.843909 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:25 crc kubenswrapper[4965]: I1125 15:04:25.843961 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:25 crc kubenswrapper[4965]: I1125 15:04:25.844021 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:26 crc kubenswrapper[4965]: I1125 15:04:26.397350 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 25 15:04:26 crc kubenswrapper[4965]: I1125 15:04:26.397463 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 25 15:04:27 crc kubenswrapper[4965]: E1125 15:04:27.090262 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 25 15:04:27 crc kubenswrapper[4965]: I1125 15:04:27.345276 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 25 15:04:27 crc kubenswrapper[4965]: I1125 15:04:27.345557 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:27 crc kubenswrapper[4965]: I1125 15:04:27.347306 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:27 crc kubenswrapper[4965]: I1125 15:04:27.347529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:27 crc kubenswrapper[4965]: I1125 15:04:27.347718 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.668512 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.692172 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.692428 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:30 crc kubenswrapper[4965]: W1125 15:04:30.746753 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.746953 4965 trace.go:236] Trace[1899595955]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:04:20.745) (total time: 10001ms):
Nov 25 15:04:30 crc kubenswrapper[4965]: Trace[1899595955]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:04:30.746)
Nov 25 15:04:30 crc kubenswrapper[4965]: Trace[1899595955]: [10.001433355s] [10.001433355s] END
Nov 25 15:04:30 crc kubenswrapper[4965]: E1125 15:04:30.747116 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.787388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.788123 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.788222 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.908695 4965 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43240->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.908739 4965 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43246->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.908768 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43240->192.168.126.11:17697: read: connection reset by peer"
Nov 25 15:04:30 crc kubenswrapper[4965]: I1125 15:04:30.908785 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43246->192.168.126.11:17697: read: connection reset by peer"
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.861961 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.863933 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66" exitCode=255
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.863989 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66"}
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.864153 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.864942 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.865024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.865051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:31 crc kubenswrapper[4965]: I1125 15:04:31.865845 4965 scope.go:117] "RemoveContainer" containerID="48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.021123 4965 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.021491 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.026283 4965 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.026337 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.867382 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.869289 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3"}
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.869479 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.870533 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.870564 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:32 crc kubenswrapper[4965]: I1125 15:04:32.870574 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.436022 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.871289 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.871501 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.872093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.872145 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.872156 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:33 crc kubenswrapper[4965]: I1125 15:04:33.874710 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:04:34 crc kubenswrapper[4965]: I1125 15:04:34.873508 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:34 crc kubenswrapper[4965]: I1125 15:04:34.874253 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:34 crc kubenswrapper[4965]: I1125 15:04:34.874282 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:34 crc kubenswrapper[4965]: I1125 15:04:34.874292 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:35 crc kubenswrapper[4965]: I1125 15:04:35.409230 4965 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 25 15:04:35 crc kubenswrapper[4965]: I1125 15:04:35.875122 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:04:35 crc kubenswrapper[4965]: I1125 15:04:35.875953 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:04:35 crc kubenswrapper[4965]: I1125 15:04:35.876020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:04:35 crc kubenswrapper[4965]: I1125 15:04:35.876033 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:04:36 crc kubenswrapper[4965]: I1125 15:04:36.397495 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 25 15:04:36 crc kubenswrapper[4965]: I1125 15:04:36.397569 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.011530 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.013011 4965 trace.go:236] Trace[439871344]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:04:25.289) (total time: 11723ms):
Nov 25 15:04:37 crc kubenswrapper[4965]: Trace[439871344]: ---"Objects listed" error: 11723ms (15:04:37.012)
Nov 25 15:04:37 crc kubenswrapper[4965]: Trace[439871344]: [11.723678906s] [11.723678906s] END
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.013038 4965 trace.go:236] Trace[61206101]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:04:25.822) (total time: 11190ms):
Nov 25 15:04:37 crc kubenswrapper[4965]: Trace[61206101]: ---"Objects listed" error: 11190ms (15:04:37.012)
Nov 25 15:04:37 crc kubenswrapper[4965]: Trace[61206101]: [11.19014342s] [11.19014342s] END
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.013051 4965 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.013055 4965 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.013227 4965 trace.go:236] Trace[996714867]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:04:25.932) (total time: 11080ms):
Nov 25 15:04:37 crc kubenswrapper[4965]: Trace[996714867]: ---"Objects listed" error: 11080ms (15:04:37.013)
Nov 25 15:04:37 crc kubenswrapper[4965]: Trace[996714867]: [11.080518016s] [11.080518016s] END
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.013242 4965 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.015119 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.016192 4965 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.659045 4965 apiserver.go:52] "Watching apiserver"
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.662175 4965 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 25 15:04:37 crc
kubenswrapper[4965]: I1125 15:04:37.662517 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.662990 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.663045 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.663190 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.663417 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.663443 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.663487 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.663507 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.663540 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.663779 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.667744 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.668032 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.668167 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.668285 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.669120 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.669300 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.669446 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.672015 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.678659 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.706391 4965 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.709326 4965 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719406 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719438 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719455 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719477 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719499 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719515 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719529 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719545 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719560 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719574 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719589 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719603 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719623 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719639 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719654 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719670 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719695 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719713 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719729 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719743 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719757 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719771 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719786 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719800 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719813 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.719828 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719846 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719860 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719874 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719891 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719906 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719920 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719934 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719950 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719982 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.719997 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.720012 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720027 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720042 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720055 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720072 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720087 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720104 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720119 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720134 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720148 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720165 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720178 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720192 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720207 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720222 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720237 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720267 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720282 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720297 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720311 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720481 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720779 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.720998 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721018 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721214 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721348 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721491 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721490 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721012 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721588 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721821 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.721849 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722014 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722025 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722052 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722069 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722177 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725477 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722205 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725709 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722259 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725729 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722275 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722302 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722422 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722460 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722468 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722483 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722542 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725793 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722623 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722650 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722683 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722727 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722732 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722891 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.722941 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723018 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723043 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723156 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723207 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723240 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723285 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723365 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723403 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725857 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723563 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725747 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723601 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723608 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723658 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725897 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723685 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726072 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726001 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726114 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726136 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726154 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 15:04:37 crc 
kubenswrapper[4965]: I1125 15:04:37.726169 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726184 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726212 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726228 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726247 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726261 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726276 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723795 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.724510 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725497 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725605 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726347 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.723600 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.725957 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726463 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726301 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726397 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726593 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726612 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726628 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726643 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726659 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726664 4965 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726675 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726691 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726705 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726718 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726733 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726748 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726762 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726777 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726779 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726793 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726841 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726851 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.726858 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727154 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727183 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727221 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727244 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727267 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727308 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727309 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727313 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727328 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727362 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727386 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727397 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727406 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727469 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727684 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727730 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727753 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727756 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727778 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727840 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727867 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.727911 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.728132 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.728314 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.728650 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.728686 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729033 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729101 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729104 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729153 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729175 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729141 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729228 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729263 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730089 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730129 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730149 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730167 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730185 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730201 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730216 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730231 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730248 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730264 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730280 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730296 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730379 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730396 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730412 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730428 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730444 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730461 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.730476 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730491 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730509 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730523 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730539 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730556 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730572 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730589 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730604 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730621 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730636 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 
25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730651 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730668 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730689 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730718 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730738 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730756 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730780 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730796 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730812 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730831 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730848 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc 
kubenswrapper[4965]: I1125 15:04:37.730867 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730884 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730900 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730919 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730936 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730951 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730995 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731018 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731041 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731061 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731083 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 25 15:04:37 crc 
kubenswrapper[4965]: I1125 15:04:37.731105 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731125 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731144 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731163 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731182 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731202 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731222 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731241 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731258 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731286 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731305 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731326 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731346 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731366 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731384 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731404 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731425 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731446 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731465 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731485 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731503 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731522 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.731542 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731562 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731581 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731601 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731622 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731642 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731662 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731682 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731701 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731721 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731743 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731775 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731795 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731817 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731860 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731886 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731908 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731928 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731953 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.731995 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732024 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.732044 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732065 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732084 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732105 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732125 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732148 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732252 4965 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732269 4965 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732283 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732296 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on 
node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732307 4965 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732316 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732327 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732337 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732347 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732357 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732366 4965 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732376 4965 reconciler_common.go:293] "Volume detached 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732385 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732394 4965 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732403 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732413 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732423 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732432 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732442 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732452 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732462 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732471 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732480 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732506 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732517 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732527 4965 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732537 4965 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732547 4965 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732559 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732568 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732578 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732586 4965 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732596 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 
15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732605 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732614 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732624 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732633 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732642 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732651 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732659 4965 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732671 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732681 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732690 4965 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732699 4965 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732709 4965 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732718 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732727 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732755 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732765 4965 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732774 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732782 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732791 4965 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732801 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732811 4965 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732820 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.732829 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732837 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732847 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732856 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732864 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732876 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732885 4965 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732893 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732902 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.732910 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737549 4965 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729261 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729405 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729518 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729874 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.729913 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730058 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730134 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730328 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730404 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730648 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730661 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730779 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.730937 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737236 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737243 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737377 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737414 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737463 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737725 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737730 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.737959 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.738041 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.738287 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.739150 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.739406 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.739418 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.740703 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.743414 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:04:38.243386079 +0000 UTC m=+23.210979825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.743498 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.744700 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.744884 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.747602 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.747888 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.747929 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.748479 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.748924 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.749152 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.749206 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.749486 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.749659 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.749921 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.750441 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.750751 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.751438 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.752403 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.752517 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.753121 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.753329 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.754685 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.754826 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.755244 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.755606 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.756530 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.756553 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.756584 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:38.25657029 +0000 UTC m=+23.224164036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.756628 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.756677 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:38.256659453 +0000 UTC m=+23.224253199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.772227 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.772831 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.773515 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.774861 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.775173 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.775308 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: 
"09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.775488 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.775883 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776074 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776140 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776189 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.776336 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.776359 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.776370 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776406 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.776420 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:38.276405694 +0000 UTC m=+23.243999440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776474 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776486 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776619 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776707 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776744 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.776923 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.777213 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.777520 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.777742 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.777914 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.778148 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.745769 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.778548 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.745813 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.779038 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.779228 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.779368 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.779495 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.779773 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.782034 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.782482 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.784471 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.784653 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.784832 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.785006 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.785635 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.786327 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.786750 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.786925 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787101 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787121 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787357 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787513 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787682 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787697 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787786 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.787834 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.788072 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.789557 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.790664 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.792068 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.792220 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.792330 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.792435 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.792443 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.792516 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.793440 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.793750 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.793865 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.794021 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.794276 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.794172 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.794677 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.796482 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.797185 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.797433 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.797546 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.797709 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.798000 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.798253 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.802781 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.802901 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.802987 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.803096 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:38.303078817 +0000 UTC m=+23.270672553 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.803152 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.808227 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.815603 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.817724 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.820213 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.827306 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833740 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833837 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833859 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833903 4965 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833916 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833924 4965 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833934 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc 
kubenswrapper[4965]: I1125 15:04:37.833942 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.833950 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834061 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834081 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834098 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834109 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834112 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834120 4965 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834148 4965 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834160 4965 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834170 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834181 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834191 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834202 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834214 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834226 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834236 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834246 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834258 4965 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834269 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834280 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" 
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834290 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834301 4965 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834310 4965 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834319 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834330 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834341 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834352 4965 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834362 4965 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834374 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834384 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834394 4965 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834405 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834416 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834428 4965 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834438 4965 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834447 4965 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834455 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834464 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834472 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834480 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834488 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834496 4965 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834504 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834513 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834520 4965 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834527 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834536 4965 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834544 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834552 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834560 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834567 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834575 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834582 4965 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834592 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834601 4965 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834611 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath 
\"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834620 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834629 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834637 4965 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834645 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834653 4965 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834660 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834668 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 
15:04:37.834675 4965 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834700 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834710 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834718 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834726 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834734 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834742 4965 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834749 4965 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834758 4965 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834766 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834775 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834783 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834791 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834799 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834806 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834814 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834822 4965 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834830 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834838 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834855 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834863 4965 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834872 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath 
\"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834879 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834892 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834900 4965 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834907 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834915 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834923 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834931 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834939 4965 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834946 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834954 4965 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834975 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834984 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.834995 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835005 4965 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835017 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835026 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835037 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835046 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835054 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835062 4965 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835070 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835078 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835088 4965 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835095 4965 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835102 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835110 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835118 4965 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835125 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835133 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" 
Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835140 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835147 4965 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835155 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835170 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835177 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835185 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835194 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835204 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835215 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835225 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835236 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.835245 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.847413 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.859497 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.875447 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.885420 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.886021 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.888117 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3" exitCode=255 Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.888161 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3"} Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.888239 4965 scope.go:117] "RemoveContainer" containerID="48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.891917 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.904058 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.945279 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.969497 4965 scope.go:117] "RemoveContainer" containerID="0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3" Nov 25 15:04:37 crc kubenswrapper[4965]: E1125 15:04:37.969674 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.969715 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.974593 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.976667 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.983433 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.990910 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.991309 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:04:37 crc kubenswrapper[4965]: I1125 15:04:37.998084 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.011472 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.020873 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.046382 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.070207 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.086244 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.100392 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:30Z\\\",\\\"message\\\":\\\"W1125 15:04:19.972905 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:04:19.973419 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764083059 cert, and key in /tmp/serving-cert-3312498416/serving-signer.crt, /tmp/serving-cert-3312498416/serving-signer.key\\\\nI1125 15:04:20.371753 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:04:20.374539 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:04:20.374808 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:20.377764 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3312498416/tls.crt::/tmp/serving-cert-3312498416/tls.key\\\\\\\"\\\\nF1125 15:04:30.902788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.112274 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.123038 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.135381 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.144101 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.152394 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.162906 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.171923 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.342674 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.342763 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.342805 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.342830 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:04:39.342802553 +0000 UTC m=+24.310396299 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.342889 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.342926 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.342946 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343019 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:39.342996498 +0000 UTC m=+24.310590264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343091 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343162 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:39.343124472 +0000 UTC m=+24.310718268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343165 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343188 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343199 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343209 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343248 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343259 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343236 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:39.343228335 +0000 UTC m=+24.310822081 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.343332 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:39.343311157 +0000 UTC m=+24.310904893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.445921 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-x42s2"] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.446308 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qtwc9"] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.446711 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.446799 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8jdpp"] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.446936 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wpjkp"] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.447189 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.447777 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.447826 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.449220 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.451248 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.451615 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.451833 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.451961 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452184 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452474 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452495 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452591 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452656 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452727 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452773 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452736 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.452808 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.453022 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.465950 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.477384 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da7
97da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.486738 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.495523 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.506152 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.515058 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.527903 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:30Z\\\",\\\"message\\\":\\\"W1125 15:04:19.972905 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:04:19.973419 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764083059 cert, and key in /tmp/serving-cert-3312498416/serving-signer.crt, /tmp/serving-cert-3312498416/serving-signer.key\\\\nI1125 15:04:20.371753 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:04:20.374539 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:04:20.374808 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:20.377764 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3312498416/tls.crt::/tmp/serving-cert-3312498416/tls.key\\\\\\\"\\\\nF1125 15:04:30.902788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.535480 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544137 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544237 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7ab112c4-45b9-468b-aa31-93b4f3c7444d-rootfs\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544266 
4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-cnibin\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544282 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-k8s-cni-cncf-io\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544312 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-etc-kubernetes\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544364 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll7w8\" (UniqueName: \"kubernetes.io/projected/e2b74ddb-bd2c-4b2d-a70e-9271305a70d7-kube-api-access-ll7w8\") pod \"node-resolver-wpjkp\" (UID: \"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\") " pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544399 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-os-release\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544425 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e2b74ddb-bd2c-4b2d-a70e-9271305a70d7-hosts-file\") pod \"node-resolver-wpjkp\" (UID: \"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\") " pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544464 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77wb\" (UniqueName: \"kubernetes.io/projected/7ab112c4-45b9-468b-aa31-93b4f3c7444d-kube-api-access-k77wb\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544481 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-conf-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544498 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7de2930c-eabd-4919-b214-30b0c83141f7-multus-daemon-config\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544513 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7p2j\" (UniqueName: \"kubernetes.io/projected/7de2930c-eabd-4919-b214-30b0c83141f7-kube-api-access-r7p2j\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 
15:04:38.544545 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-system-cni-dir\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544564 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32470785-6a9e-4ab4-bd44-a585e188fa99-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544597 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qbx\" (UniqueName: \"kubernetes.io/projected/32470785-6a9e-4ab4-bd44-a585e188fa99-kube-api-access-m6qbx\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544649 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-system-cni-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544667 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-multus-certs\") pod \"multus-8jdpp\" (UID: 
\"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544703 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-cni-bin\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544778 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32470785-6a9e-4ab4-bd44-a585e188fa99-cni-binary-copy\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544799 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544815 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-cni-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544829 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-netns\") pod 
\"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544877 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-hostroot\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.544915 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7de2930c-eabd-4919-b214-30b0c83141f7-cni-binary-copy\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545014 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-cni-multus\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545031 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-os-release\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545083 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-socket-dir-parent\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") 
" pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545100 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-kubelet\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545114 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ab112c4-45b9-468b-aa31-93b4f3c7444d-proxy-tls\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545169 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ab112c4-45b9-468b-aa31-93b4f3c7444d-mcd-auth-proxy-config\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.545193 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-cnibin\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.553318 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:30Z\\\",\\\"message\\\":\\\"W1125 15:04:19.972905 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:04:19.973419 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764083059 cert, and key in /tmp/serving-cert-3312498416/serving-signer.crt, /tmp/serving-cert-3312498416/serving-signer.key\\\\nI1125 15:04:20.371753 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:04:20.374539 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:04:20.374808 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:20.377764 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3312498416/tls.crt::/tmp/serving-cert-3312498416/tls.key\\\\\\\"\\\\nF1125 15:04:30.902788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.562145 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.571064 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.578067 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.588940 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.598160 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.607708 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.615124 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.627242 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.636212 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.644600 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645773 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7ab112c4-45b9-468b-aa31-93b4f3c7444d-rootfs\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645814 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-cnibin\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645839 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-k8s-cni-cncf-io\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc 
kubenswrapper[4965]: I1125 15:04:38.645860 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-etc-kubernetes\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645884 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll7w8\" (UniqueName: \"kubernetes.io/projected/e2b74ddb-bd2c-4b2d-a70e-9271305a70d7-kube-api-access-ll7w8\") pod \"node-resolver-wpjkp\" (UID: \"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\") " pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645917 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-os-release\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645937 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e2b74ddb-bd2c-4b2d-a70e-9271305a70d7-hosts-file\") pod \"node-resolver-wpjkp\" (UID: \"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\") " pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645935 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7ab112c4-45b9-468b-aa31-93b4f3c7444d-rootfs\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645994 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-k8s-cni-cncf-io\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645981 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-cnibin\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.645959 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k77wb\" (UniqueName: \"kubernetes.io/projected/7ab112c4-45b9-468b-aa31-93b4f3c7444d-kube-api-access-k77wb\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646099 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-etc-kubernetes\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646101 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7de2930c-eabd-4919-b214-30b0c83141f7-multus-daemon-config\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646137 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7p2j\" (UniqueName: 
\"kubernetes.io/projected/7de2930c-eabd-4919-b214-30b0c83141f7-kube-api-access-r7p2j\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646153 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-conf-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646168 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32470785-6a9e-4ab4-bd44-a585e188fa99-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646177 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e2b74ddb-bd2c-4b2d-a70e-9271305a70d7-hosts-file\") pod \"node-resolver-wpjkp\" (UID: \"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\") " pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646183 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qbx\" (UniqueName: \"kubernetes.io/projected/32470785-6a9e-4ab4-bd44-a585e188fa99-kube-api-access-m6qbx\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646214 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-system-cni-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646230 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-multus-certs\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646246 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-system-cni-dir\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646245 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-os-release\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646262 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-cni-bin\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646279 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/32470785-6a9e-4ab4-bd44-a585e188fa99-cni-binary-copy\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646289 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-conf-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646296 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646312 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-cni-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646327 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-netns\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646341 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-hostroot\") pod \"multus-8jdpp\" 
(UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646385 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7de2930c-eabd-4919-b214-30b0c83141f7-cni-binary-copy\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646399 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-cni-multus\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646417 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-os-release\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646431 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-socket-dir-parent\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646446 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-kubelet\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 
15:04:38.646462 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ab112c4-45b9-468b-aa31-93b4f3c7444d-mcd-auth-proxy-config\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646492 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ab112c4-45b9-468b-aa31-93b4f3c7444d-proxy-tls\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646512 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-cnibin\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646553 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-cnibin\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646560 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-netns\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646579 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-hostroot\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646659 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-system-cni-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646684 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-run-multus-certs\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646711 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-system-cni-dir\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.646753 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-cni-bin\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647014 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/32470785-6a9e-4ab4-bd44-a585e188fa99-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647014 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7de2930c-eabd-4919-b214-30b0c83141f7-multus-daemon-config\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647071 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-socket-dir-parent\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647080 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7de2930c-eabd-4919-b214-30b0c83141f7-cni-binary-copy\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647369 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-cni-multus\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647453 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-os-release\") pod \"multus-8jdpp\" (UID: 
\"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.647997 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ab112c4-45b9-468b-aa31-93b4f3c7444d-mcd-auth-proxy-config\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.648185 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-multus-cni-dir\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.648241 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7de2930c-eabd-4919-b214-30b0c83141f7-host-var-lib-kubelet\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.648393 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32470785-6a9e-4ab4-bd44-a585e188fa99-cni-binary-copy\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.651337 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32470785-6a9e-4ab4-bd44-a585e188fa99-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " 
pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.658134 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ab112c4-45b9-468b-aa31-93b4f3c7444d-proxy-tls\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.662324 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.665917 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll7w8\" (UniqueName: \"kubernetes.io/projected/e2b74ddb-bd2c-4b2d-a70e-9271305a70d7-kube-api-access-ll7w8\") pod \"node-resolver-wpjkp\" (UID: \"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\") " pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.668017 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77wb\" (UniqueName: \"kubernetes.io/projected/7ab112c4-45b9-468b-aa31-93b4f3c7444d-kube-api-access-k77wb\") pod \"machine-config-daemon-x42s2\" (UID: \"7ab112c4-45b9-468b-aa31-93b4f3c7444d\") " pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.668662 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qbx\" (UniqueName: \"kubernetes.io/projected/32470785-6a9e-4ab4-bd44-a585e188fa99-kube-api-access-m6qbx\") pod \"multus-additional-cni-plugins-qtwc9\" (UID: \"32470785-6a9e-4ab4-bd44-a585e188fa99\") " pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.671173 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7p2j\" (UniqueName: \"kubernetes.io/projected/7de2930c-eabd-4919-b214-30b0c83141f7-kube-api-access-r7p2j\") pod \"multus-8jdpp\" (UID: \"7de2930c-eabd-4919-b214-30b0c83141f7\") " pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.761519 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.768349 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wpjkp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.775767 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.776593 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.777836 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.778403 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.778546 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.779631 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.780199 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.781071 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.782098 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.783136 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.783645 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.784289 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.785523 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8jdpp" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.785851 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.787739 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.790152 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.791080 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.799511 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.802070 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.807361 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.808469 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.809236 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.811416 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.814648 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.815206 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.816562 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.817235 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.818537 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.819325 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.820444 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.821225 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.822841 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.823429 4965 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.823559 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.825958 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.827067 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.827661 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.829297 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.829937 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.830827 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.831474 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.832547 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.832995 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.833995 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.834602 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.835864 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.836391 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.836908 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.837825 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.838508 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.839523 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.839999 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.840906 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.841629 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.842179 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.842634 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.843475 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-58mtl"] Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.844163 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.846360 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.846394 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.846620 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.846655 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.846823 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.846882 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.847083 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.858928 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.878454 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.891057 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"706679d21f1139decdca502dfc2333f5b2e428344e33187c1ed5ddde0a86643e"} Nov 25 15:04:38 crc 
kubenswrapper[4965]: I1125 15:04:38.891668 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fc295dd2b7c5d13abb1f308b7d91751a53b82634549bfbbb79f5d84c42587902"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.892926 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.892948 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.892958 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e8d54fd535f657473dd840c5afcf2f1f2f8173d1a3c13e5cb12daf77773a2f1e"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.893931 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.894671 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.894698 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"98f49e66a763af5d7f752db846bf4f74329c31c3d347dc7acf14d86a1a19e0dc"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.895625 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerStarted","Data":"cfc04186faa2a3a7419de1aba560e19e99c6bca766998695ff1a3246e56beca2"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.897247 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerStarted","Data":"3e1363201055699620da0673df961179e1761721850d984b9cc213382fc026fa"} Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.898238 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.904017 4965 scope.go:117] "RemoveContainer" containerID="0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3" Nov 25 15:04:38 crc kubenswrapper[4965]: E1125 15:04:38.904207 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.905148 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wpjkp" event={"ID":"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7","Type":"ContainerStarted","Data":"334afe709245dd20b22591899effb9a1688d0bb7b5880ef0c274b66698cd21cf"} Nov 25 15:04:38 
crc kubenswrapper[4965]: I1125 15:04:38.912819 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.932516 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.946284 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948474 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948499 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-config\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948524 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-log-socket\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948538 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-env-overrides\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948553 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948566 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eea3820a-3f97-48a7-8b49-def506fe71e2-ovn-node-metrics-cert\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948589 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-etc-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948601 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-bin\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948614 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: 
\"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948631 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjnz\" (UniqueName: \"kubernetes.io/projected/eea3820a-3f97-48a7-8b49-def506fe71e2-kube-api-access-9mjnz\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948651 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-slash\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948664 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-script-lib\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948688 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-netns\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948701 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-node-log\") pod 
\"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948728 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-systemd-units\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948742 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-netd\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948772 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-systemd\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948787 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-var-lib-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948807 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-ovn\") 
pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.948823 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-kubelet\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.958921 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.973901 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:38 crc kubenswrapper[4965]: I1125 15:04:38.988588 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c771bd20dbd5a19b758162af3912cbb8238aac32e0e3d45c1994aaf053af66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:30Z\\\",\\\"message\\\":\\\"W1125 15:04:19.972905 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:04:19.973419 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764083059 cert, and key in /tmp/serving-cert-3312498416/serving-signer.crt, 
/tmp/serving-cert-3312498416/serving-signer.key\\\\nI1125 15:04:20.371753 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:04:20.374539 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:04:20.374808 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:20.377764 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3312498416/tls.crt::/tmp/serving-cert-3312498416/tls.key\\\\\\\"\\\\nF1125 15:04:30.902788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.003620 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.019921 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.032811 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.049844 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-kubelet\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.049905 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.049930 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-config\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.049932 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-kubelet\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050021 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050032 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-log-socket\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.049947 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-log-socket\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050113 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050136 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-env-overrides\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050159 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/eea3820a-3f97-48a7-8b49-def506fe71e2-ovn-node-metrics-cert\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050184 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-etc-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050186 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050207 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-bin\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050232 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050267 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjnz\" (UniqueName: 
\"kubernetes.io/projected/eea3820a-3f97-48a7-8b49-def506fe71e2-kube-api-access-9mjnz\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050291 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-bin\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050292 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-slash\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050317 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-slash\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050349 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-netns\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050370 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-node-log\") pod \"ovnkube-node-58mtl\" (UID: 
\"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050388 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-script-lib\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050422 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-systemd-units\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050440 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-netd\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050462 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-systemd\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050484 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-var-lib-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050516 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-ovn\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050266 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-etc-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050564 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-ovn\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050593 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-systemd-units\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050622 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-netd\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050643 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-env-overrides\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050671 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-var-lib-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050648 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-systemd\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050348 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-openvswitch\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050703 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-netns\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050722 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-node-log\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.050735 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-config\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.051103 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-script-lib\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.053723 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eea3820a-3f97-48a7-8b49-def506fe71e2-ovn-node-metrics-cert\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.055270 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.068691 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.076561 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjnz\" (UniqueName: \"kubernetes.io/projected/eea3820a-3f97-48a7-8b49-def506fe71e2-kube-api-access-9mjnz\") pod \"ovnkube-node-58mtl\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: 
I1125 15:04:39.089103 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.103445 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.119287 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.156462 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.180748 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.191520 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: W1125 15:04:39.192843 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea3820a_3f97_48a7_8b49_def506fe71e2.slice/crio-af25b7e7f5d56b7885f8f972888ff2580308857c14133f3483a1e83e2d2684af WatchSource:0}: Error finding container af25b7e7f5d56b7885f8f972888ff2580308857c14133f3483a1e83e2d2684af: Status 404 returned error can't find the container with id af25b7e7f5d56b7885f8f972888ff2580308857c14133f3483a1e83e2d2684af Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.239938 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.270915 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.310824 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869
e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.351051 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.352278 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.352373 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.352414 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.352443 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.352466 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.352560 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.352623 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:41.352607326 +0000 UTC m=+26.320201072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.352912 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:04:41.352904025 +0000 UTC m=+26.320497771 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.352950 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.352999 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:04:41.352990217 +0000 UTC m=+26.320583963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353053 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353067 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353078 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353107 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:41.35309959 +0000 UTC m=+26.320693336 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353152 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353166 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353174 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.353200 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:41.353193363 +0000 UTC m=+26.320787109 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.378137 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.390211 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.428064 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.469940 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.770957 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.771094 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.770988 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.771158 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.770952 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.771204 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.908273 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerStarted","Data":"af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.910152 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerDied","Data":"07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.910203 4965 generic.go:334] "Generic (PLEG): container finished" podID="32470785-6a9e-4ab4-bd44-a585e188fa99" containerID="07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e" exitCode=0 Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.911815 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wpjkp" 
event={"ID":"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7","Type":"ContainerStarted","Data":"9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.913255 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.913300 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.914521 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434" exitCode=0 Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.914615 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.914662 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"af25b7e7f5d56b7885f8f972888ff2580308857c14133f3483a1e83e2d2684af"} Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.915557 4965 scope.go:117] "RemoveContainer" containerID="0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3" Nov 25 15:04:39 crc kubenswrapper[4965]: E1125 15:04:39.916197 4965 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.932583 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.950498 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.964317 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:39 crc kubenswrapper[4965]: I1125 15:04:39.980437 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.002810 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.017079 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.031808 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.046995 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.061857 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869
e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.078555 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.104454 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.121707 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.138869 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.156723 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.179728 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.192896 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.208079 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869
e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.221686 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.233386 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.269272 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.308703 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.349902 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.392847 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.429702 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.470643 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.516427 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.721608 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.739312 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.740627 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.743431 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.764740 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.781105 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.794060 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869
e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.809064 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.821800 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.840751 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.851835 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.890177 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.922115 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.922161 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.922177 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" 
event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.922187 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.922198 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.923542 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.925187 4965 generic.go:334] "Generic (PLEG): container finished" podID="32470785-6a9e-4ab4-bd44-a585e188fa99" containerID="e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492" exitCode=0 Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.926221 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerDied","Data":"e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492"} Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.935082 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:40 crc kubenswrapper[4965]: I1125 15:04:40.969025 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.012591 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.057395 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.092899 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.128322 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.169024 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.208510 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.248861 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.297106 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.328988 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.370700 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:41 crc 
kubenswrapper[4965]: I1125 15:04:41.370775 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.370800 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.370820 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.370866 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:04:45.370845445 +0000 UTC m=+30.338439191 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.370902 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.370907 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.370907 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371000 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:45.370991529 +0000 UTC m=+30.338585275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371017 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371065 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371078 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371132 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:04:45.371113384 +0000 UTC m=+30.338707190 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371017 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371157 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371166 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371195 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:45.371186456 +0000 UTC m=+30.338780312 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371020 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.371230 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:45.371223057 +0000 UTC m=+30.338816893 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.408129 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.445316 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-59czm"] Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.445712 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.452898 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.461261 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.480922 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.500775 4965 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.521934 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.569305 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.572667 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxgd\" (UniqueName: \"kubernetes.io/projected/eec84672-2d9f-4b5a-9d5d-514bf609b63d-kube-api-access-8fxgd\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: 
I1125 15:04:41.572708 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eec84672-2d9f-4b5a-9d5d-514bf609b63d-host\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.572791 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eec84672-2d9f-4b5a-9d5d-514bf609b63d-serviceca\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.610839 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.660615 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.673932 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxgd\" (UniqueName: \"kubernetes.io/projected/eec84672-2d9f-4b5a-9d5d-514bf609b63d-kube-api-access-8fxgd\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.673984 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eec84672-2d9f-4b5a-9d5d-514bf609b63d-host\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 
15:04:41.674019 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eec84672-2d9f-4b5a-9d5d-514bf609b63d-serviceca\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.674129 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eec84672-2d9f-4b5a-9d5d-514bf609b63d-host\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.674934 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eec84672-2d9f-4b5a-9d5d-514bf609b63d-serviceca\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.688360 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.719805 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxgd\" (UniqueName: \"kubernetes.io/projected/eec84672-2d9f-4b5a-9d5d-514bf609b63d-kube-api-access-8fxgd\") pod \"node-ca-59czm\" (UID: \"eec84672-2d9f-4b5a-9d5d-514bf609b63d\") " pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.750878 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f3
96155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.756828 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-59czm" Nov 25 15:04:41 crc kubenswrapper[4965]: W1125 15:04:41.768541 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec84672_2d9f_4b5a_9d5d_514bf609b63d.slice/crio-5b86a7eae816ff450dcca719ae241dd31c352bf2261c384af536af5fff8b4b6b WatchSource:0}: Error finding container 5b86a7eae816ff450dcca719ae241dd31c352bf2261c384af536af5fff8b4b6b: Status 404 returned error can't find the container with id 5b86a7eae816ff450dcca719ae241dd31c352bf2261c384af536af5fff8b4b6b Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.770460 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.770475 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.770554 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.770456 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.770706 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:41 crc kubenswrapper[4965]: E1125 15:04:41.770813 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.797819 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.828182 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.868640 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.909564 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.930859 4965 generic.go:334] "Generic (PLEG): container finished" podID="32470785-6a9e-4ab4-bd44-a585e188fa99" 
containerID="129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea" exitCode=0 Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.931019 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerDied","Data":"129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea"} Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.931988 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-59czm" event={"ID":"eec84672-2d9f-4b5a-9d5d-514bf609b63d","Type":"ContainerStarted","Data":"5b86a7eae816ff450dcca719ae241dd31c352bf2261c384af536af5fff8b4b6b"} Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.937409 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2"} Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.951957 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:41 crc kubenswrapper[4965]: I1125 15:04:41.990858 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.028495 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.070070 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.113368 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.151092 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.195310 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.230512 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.270090 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.307490 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.349808 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.394545 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.427175 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.471838 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3
ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.513173 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.549279 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a9
93597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.596697 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.633035 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.670357 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.708885 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.755148 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.790618 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.832020 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.873389 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.912944 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.942521 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-59czm" 
event={"ID":"eec84672-2d9f-4b5a-9d5d-514bf609b63d","Type":"ContainerStarted","Data":"6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f"} Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.946525 4965 generic.go:334] "Generic (PLEG): container finished" podID="32470785-6a9e-4ab4-bd44-a585e188fa99" containerID="ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7" exitCode=0 Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.946572 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerDied","Data":"ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7"} Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.966564 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:42 crc kubenswrapper[4965]: I1125 15:04:42.995581 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.029548 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.073191 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3
ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.118075 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.148225 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.189582 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da7
97da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.238577 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.275308 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.311792 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.351593 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869
e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.389025 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.400617 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.404830 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.415178 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.416653 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.416913 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.416927 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.417041 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.429271 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.482039 4965 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.482290 4965 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.483304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.483347 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.483356 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.483372 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.483381 4965 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.498309 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.501195 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.501227 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.501238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.501254 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.501265 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.508268 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.511944 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.518436 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.518473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.518482 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.518497 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.518506 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.529900 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.532992 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.533018 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.533026 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.533039 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.533048 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.544338 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.547784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.547842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.547865 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.547884 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.547895 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.550526 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.560018 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.560152 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.561856 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.561894 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.561905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.561921 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.561932 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.590933 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.628111 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.664084 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.664111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.664119 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.664131 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.664140 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.673034 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.708784 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.748751 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.766836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.766864 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.766880 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.766894 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.766902 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.771053 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.771109 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.771187 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.771209 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.771295 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:43 crc kubenswrapper[4965]: E1125 15:04:43.771363 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.789805 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.838712 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.869319 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.869368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.869380 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.869396 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.869407 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.871555 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.911313 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.949274 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.952594 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.955228 4965 generic.go:334] "Generic (PLEG): container finished" podID="32470785-6a9e-4ab4-bd44-a585e188fa99" containerID="34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73" exitCode=0 Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.955259 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerDied","Data":"34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.974049 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.974079 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.974089 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.974104 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.974114 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:43Z","lastTransitionTime":"2025-11-25T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:43 crc kubenswrapper[4965]: I1125 15:04:43.990101 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.029743 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.068704 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.076719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.076755 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.076767 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.076785 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.076798 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.109436 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486d
a797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.158547 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.180127 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.180176 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.180188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.180206 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.180218 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.190666 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.231106 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.275629 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.283305 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.283352 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.283363 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.283378 4965 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.283388 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.312814 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.352414 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.385758 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.385798 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.385812 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.385828 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.385840 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.405580 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75802
5262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.431241 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.470345 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.488186 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.488458 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.489149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.489477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.489611 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.509449 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.550112 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.592503 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.593033 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.593073 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.593087 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.593108 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.593125 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.631301 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.669312 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.695849 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.695894 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.695905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.695924 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.695936 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.710592 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.752114 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.797763 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc 
kubenswrapper[4965]: I1125 15:04:44.797991 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.798074 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.798196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.798283 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.900534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.900580 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.900590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.900608 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.900620 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:44Z","lastTransitionTime":"2025-11-25T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:44 crc kubenswrapper[4965]: I1125 15:04:44.960403 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerStarted","Data":"50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.002525 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.002590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.002605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.002627 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.002642 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.105509 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.105554 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.105569 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.105589 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.105604 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.208936 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.209291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.209473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.209626 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.209745 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.312051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.312120 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.312143 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.312172 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.312194 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.410429 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.410542 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.410590 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.410630 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.410671 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410753 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410771 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410775 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410754 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:04:53.410715611 +0000 UTC m=+38.378309397 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410837 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410867 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410868 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:53.410851625 +0000 UTC m=+38.378445411 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410873 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410925 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:53.410905656 +0000 UTC m=+38.378499502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410955 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:53.410939657 +0000 UTC m=+38.378533443 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.410808 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.411006 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.411037 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:53.41102859 +0000 UTC m=+38.378622476 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.415503 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.415706 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.415814 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.415934 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.416129 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.518988 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.519024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.519035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.519051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.519063 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.620765 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.620820 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.620832 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.620848 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.620860 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.723549 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.723581 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.723590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.723604 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.723613 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.771120 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.771206 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.771251 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.771356 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.771120 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:45 crc kubenswrapper[4965]: E1125 15:04:45.771453 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.825784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.825816 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.825824 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.825838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.825848 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.929024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.929082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.929097 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.929118 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.929130 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:45Z","lastTransitionTime":"2025-11-25T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.968452 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.969118 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.969164 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.969180 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.972119 4965 generic.go:334] "Generic (PLEG): container finished" podID="32470785-6a9e-4ab4-bd44-a585e188fa99" containerID="50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495" exitCode=0 Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.972149 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerDied","Data":"50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495"} Nov 25 15:04:45 crc kubenswrapper[4965]: I1125 15:04:45.992409 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.009283 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.021674 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.033392 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 
15:04:46.033436 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.033448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.033466 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.033478 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.044026 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.066443 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.134132 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.135294 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.135339 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.135351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.135366 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.135377 
4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.135629 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.136131 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.145483 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.159273 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.181260 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d5837
5e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.194688 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.213476 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.228224 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.238359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.238397 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.238406 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc 
kubenswrapper[4965]: I1125 15:04:46.238419 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.238430 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.242613 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 
15:04:46.251142 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.265029 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.278697 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.290341 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.301247 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.313343 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.327787 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.336488 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.340281 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.340310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.340319 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.340332 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.340341 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.350487 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.367744 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.390899 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.413036 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.425317 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.438659 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.442443 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.442490 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.442501 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.442518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.442559 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.448537 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.459609 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.476514 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.544798 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.544833 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.544842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.544857 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.544892 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.647654 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.647693 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.647704 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.647721 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.647732 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.749406 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.749439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.749447 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.749460 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.749469 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.783814 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.795544 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.808456 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.827242 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.838829 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.851784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.851816 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.851830 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc 
kubenswrapper[4965]: I1125 15:04:46.851848 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.851861 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.852810 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.865143 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.879497 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.891708 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.903770 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.925543 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.938678 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.952653 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.954078 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.954118 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.954128 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.954146 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.954157 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:46Z","lastTransitionTime":"2025-11-25T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.962807 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:46 crc kubenswrapper[4965]: I1125 15:04:46.975264 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.056571 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.056602 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.056611 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.056625 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.056634 4965 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.212551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.213111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.213197 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.213288 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.213369 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.316375 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.316421 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.316453 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.316471 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.316485 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.418784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.418825 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.418834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.418848 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.418857 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.520893 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.520930 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.520940 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.520954 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.520977 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.623518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.623558 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.623569 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.623586 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.623599 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.726017 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.726051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.726061 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.726074 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.726082 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.770448 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.770519 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.770449 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:47 crc kubenswrapper[4965]: E1125 15:04:47.770575 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:47 crc kubenswrapper[4965]: E1125 15:04:47.770640 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:47 crc kubenswrapper[4965]: E1125 15:04:47.770696 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.828700 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.828738 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.828747 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.828762 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.828777 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.931105 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.931139 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.931148 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.931161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.931170 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:47Z","lastTransitionTime":"2025-11-25T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.980711 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" event={"ID":"32470785-6a9e-4ab4-bd44-a585e188fa99","Type":"ContainerStarted","Data":"5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2"} Nov 25 15:04:47 crc kubenswrapper[4965]: I1125 15:04:47.994448 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.006257 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.023171 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.033207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.033255 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.033267 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc 
kubenswrapper[4965]: I1125 15:04:48.033285 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.033297 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.039991 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.058760 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.070579 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.087940 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85f
fe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.107163 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.120961 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.133732 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.135026 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.135093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.135106 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.135122 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.135136 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.156007 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.169781 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.182719 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.194226 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.208643 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.237161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.237207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.237217 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.237233 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.237244 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.339587 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.339616 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.339629 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.339645 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.339659 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.442427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.442467 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.442479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.442496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.442507 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.545258 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.545312 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.545328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.545352 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.545370 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.653431 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.653513 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.653540 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.653570 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.653590 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.755847 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.755894 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.755905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.755923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.755934 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.858176 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.858226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.858240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.858258 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.858271 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.961080 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.961125 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.961138 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.961155 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:48 crc kubenswrapper[4965]: I1125 15:04:48.961169 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:48Z","lastTransitionTime":"2025-11-25T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.063946 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.063991 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.064000 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.064015 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.064027 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.166231 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.166263 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.166273 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.166289 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.166299 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.268801 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.269330 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.269412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.269518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.269599 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.371489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.371519 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.371527 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.371540 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.371549 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.473903 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.474136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.474218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.474298 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.474384 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.577011 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.577285 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.577294 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.577306 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.577314 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.679336 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.679370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.679381 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.679396 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.679406 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.771239 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.771303 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.771333 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:49 crc kubenswrapper[4965]: E1125 15:04:49.771366 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:49 crc kubenswrapper[4965]: E1125 15:04:49.771444 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:49 crc kubenswrapper[4965]: E1125 15:04:49.771504 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.781160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.781192 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.781204 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.781218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.781226 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.883261 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.883289 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.883297 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.883312 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.883332 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.987491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.987534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.987548 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.987567 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:49 crc kubenswrapper[4965]: I1125 15:04:49.987584 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:49Z","lastTransitionTime":"2025-11-25T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.089644 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.089833 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.089892 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.089997 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.090073 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.193309 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.193363 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.193384 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.193407 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.193424 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.296073 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.296367 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.296388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.296406 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.296420 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.398801 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.398852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.398862 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.398878 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.398890 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.501227 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.501272 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.501283 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.501302 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.501314 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.604141 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.604232 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.604252 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.604756 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.604842 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.653002 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj"] Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.653440 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.655894 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.661327 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.663168 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rpzk\" (UniqueName: \"kubernetes.io/projected/0a73fd66-1e46-4473-8508-a8cf24d51a04-kube-api-access-9rpzk\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.663358 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a73fd66-1e46-4473-8508-a8cf24d51a04-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.663453 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a73fd66-1e46-4473-8508-a8cf24d51a04-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.663529 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a73fd66-1e46-4473-8508-a8cf24d51a04-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.677424 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f
26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.708767 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.708811 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.708823 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.708842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.708854 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.711381 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\
":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.724690 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.739677 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.754121 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.764321 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a73fd66-1e46-4473-8508-a8cf24d51a04-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.764390 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a73fd66-1e46-4473-8508-a8cf24d51a04-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.764420 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a73fd66-1e46-4473-8508-a8cf24d51a04-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.764471 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9rpzk\" (UniqueName: \"kubernetes.io/projected/0a73fd66-1e46-4473-8508-a8cf24d51a04-kube-api-access-9rpzk\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.765012 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a73fd66-1e46-4473-8508-a8cf24d51a04-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.765123 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a73fd66-1e46-4473-8508-a8cf24d51a04-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.771326 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869
e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.773428 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a73fd66-1e46-4473-8508-a8cf24d51a04-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.783554 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rpzk\" (UniqueName: \"kubernetes.io/projected/0a73fd66-1e46-4473-8508-a8cf24d51a04-kube-api-access-9rpzk\") pod \"ovnkube-control-plane-749d76644c-9t6rj\" (UID: \"0a73fd66-1e46-4473-8508-a8cf24d51a04\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.788698 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.800704 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.811459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.811492 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.811504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.811522 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.811534 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.812528 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.825390 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.837979 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.852007 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.864835 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.875893 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.890099 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.912573 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.914249 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.914283 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.914299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.914315 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.914325 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:50Z","lastTransitionTime":"2025-11-25T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.975787 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.995143 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/0.log" Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.997834 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c" exitCode=1 Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.997870 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c"} Nov 25 15:04:50 crc kubenswrapper[4965]: I1125 15:04:50.998405 4965 scope.go:117] "RemoveContainer" containerID="5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.014253 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.017518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 
15:04:51.017549 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.017560 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.017577 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.017589 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.027112 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.040175 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.056506 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.072320 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:49.942082 6156 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:04:49.942097 6156 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI1125 15:04:49.942109 6156 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:49.942114 6156 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:04:49.942177 6156 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:04:49.942188 6156 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:04:49.942199 6156 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:04:49.942218 6156 factory.go:656] Stopping watch factory\\\\nI1125 15:04:49.942231 6156 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:04:49.942243 6156 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:49.942250 6156 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:04:49.942256 6156 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:49.942262 6156 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:04:49.942269 6156 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:04:49.942281 6156 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.083577 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.104950 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.116715 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.120847 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.120884 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.120897 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc 
kubenswrapper[4965]: I1125 15:04:51.120914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.120926 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.130313 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 
15:04:51.139173 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.153319 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.166371 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.184094 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.196551 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.213354 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.224571 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc 
kubenswrapper[4965]: I1125 15:04:51.224623 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.224637 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.224658 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.224672 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.225676 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.326957 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.327012 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.327020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.327034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.327042 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.430085 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.430130 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.430148 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.430171 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.430185 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.533624 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.533678 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.533699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.533724 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.533742 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.636640 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.637140 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.637160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.637185 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.637203 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.739318 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.739363 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.739376 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.739403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.739414 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.770494 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:51 crc kubenswrapper[4965]: E1125 15:04:51.770616 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.771002 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:51 crc kubenswrapper[4965]: E1125 15:04:51.771080 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.771132 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:51 crc kubenswrapper[4965]: E1125 15:04:51.771190 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.841882 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.841914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.841923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.841937 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.841945 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.944272 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.944299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.944307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.944320 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:51 crc kubenswrapper[4965]: I1125 15:04:51.944329 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:51Z","lastTransitionTime":"2025-11-25T15:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.003445 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/0.log" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.006373 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.006802 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.008426 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" event={"ID":"0a73fd66-1e46-4473-8508-a8cf24d51a04","Type":"ContainerStarted","Data":"c58033e15edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.008559 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" event={"ID":"0a73fd66-1e46-4473-8508-a8cf24d51a04","Type":"ContainerStarted","Data":"712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.008641 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" event={"ID":"0a73fd66-1e46-4473-8508-a8cf24d51a04","Type":"ContainerStarted","Data":"a893710b4ddca9db7285c24441acbccc24a1628e32f5bfff3f4a65701a51ce56"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.028110 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae0762910
60008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\"
:\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.045212 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:49.942082 6156 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:04:49.942097 6156 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:04:49.942109 6156 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 
15:04:49.942114 6156 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:04:49.942177 6156 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:04:49.942188 6156 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:04:49.942199 6156 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:04:49.942218 6156 factory.go:656] Stopping watch factory\\\\nI1125 15:04:49.942231 6156 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:04:49.942243 6156 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:49.942250 6156 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:04:49.942256 6156 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:49.942262 6156 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:04:49.942269 6156 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:04:49.942281 6156 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.047082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.047120 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.047131 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.047154 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.047166 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.060593 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.073680 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.084569 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.100582 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.113396 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.123518 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.136798 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.149491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.149537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.149549 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.149570 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.149609 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.170885 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-j87z5"] Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.171310 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc kubenswrapper[4965]: E1125 15:04:52.171367 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.175451 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.178081 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.178264 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh2jw\" (UniqueName: \"kubernetes.io/projected/6ed72551-610b-4f03-8a57-319ef27e27e0-kube-api-access-lh2jw\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.202891 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.228834 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.245306 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.251736 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc 
kubenswrapper[4965]: I1125 15:04:52.251771 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.251782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.251800 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.251811 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.257859 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.272116 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.279139 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2jw\" (UniqueName: \"kubernetes.io/projected/6ed72551-610b-4f03-8a57-319ef27e27e0-kube-api-access-lh2jw\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.279195 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc kubenswrapper[4965]: E1125 15:04:52.279321 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:52 crc 
kubenswrapper[4965]: E1125 15:04:52.279366 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:52.77935354 +0000 UTC m=+37.746947286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.284190 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.300426 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.302242 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2jw\" (UniqueName: \"kubernetes.io/projected/6ed72551-610b-4f03-8a57-319ef27e27e0-kube-api-access-lh2jw\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc 
kubenswrapper[4965]: I1125 15:04:52.318562 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.332217 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.349995 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a
8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.354750 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.354821 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.354836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.354883 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.354900 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.376150 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:49.942082 6156 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:04:49.942097 6156 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:04:49.942109 6156 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 
15:04:49.942114 6156 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:04:49.942177 6156 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:04:49.942188 6156 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:04:49.942199 6156 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:04:49.942218 6156 factory.go:656] Stopping watch factory\\\\nI1125 15:04:49.942231 6156 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:04:49.942243 6156 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:49.942250 6156 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:04:49.942256 6156 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:49.942262 6156 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:04:49.942269 6156 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:04:49.942281 6156 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.389566 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc 
kubenswrapper[4965]: I1125 15:04:52.405738 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.435015 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.449069 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.457095 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.457133 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.457144 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc 
kubenswrapper[4965]: I1125 15:04:52.457160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.457172 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.464683 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 
15:04:52.474540 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.489080 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.504131 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.517092 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.528906 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.539130 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.547612 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.559607 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.559647 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.559658 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.559675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.559685 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.663161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.663215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.663238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.663264 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.663284 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.765839 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.765874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.765883 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.765895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.765904 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.783913 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:52 crc kubenswrapper[4965]: E1125 15:04:52.784171 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:52 crc kubenswrapper[4965]: E1125 15:04:52.784259 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:53.784235727 +0000 UTC m=+38.751829483 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.868827 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.868860 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.868873 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.868889 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.868901 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.972047 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.972082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.972094 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.972109 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:52 crc kubenswrapper[4965]: I1125 15:04:52.972121 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:52Z","lastTransitionTime":"2025-11-25T15:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.016030 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/1.log" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.016889 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/0.log" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.023025 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe" exitCode=1 Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.023073 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.023112 4965 scope.go:117] "RemoveContainer" containerID="5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.024663 4965 scope.go:117] "RemoveContainer" containerID="9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.025057 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.046529 4965 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.059858 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.070631 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.074223 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 
15:04:53.074266 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.074278 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.074296 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.074308 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.083634 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.097650 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.110371 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.124416 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.137762 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.151955 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.177016 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.177066 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.177112 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.177131 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 
15:04:53.177148 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.174516 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:49.942082 6156 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:04:49.942097 6156 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:04:49.942109 6156 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 
15:04:49.942114 6156 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:04:49.942177 6156 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:04:49.942188 6156 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:04:49.942199 6156 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:04:49.942218 6156 factory.go:656] Stopping watch factory\\\\nI1125 15:04:49.942231 6156 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:04:49.942243 6156 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:49.942250 6156 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:04:49.942256 6156 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:49.942262 6156 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:04:49.942269 6156 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:04:49.942281 6156 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping 
reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/va
r/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.191811 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc 
kubenswrapper[4965]: I1125 15:04:53.205394 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.230621 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.243448 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.261423 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.271359 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.279327 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.279393 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.279417 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.279444 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.279465 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.283483 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.402951 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.403011 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.403023 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.403037 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.403046 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.502654 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.502768 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.502813 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.502843 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.502866 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.502954 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503054 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:09.503037526 +0000 UTC m=+54.470631272 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503181 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503238 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503251 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503300 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:09.503284293 +0000 UTC m=+54.470878039 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503325 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503428 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503444 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503452 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:53 crc 
kubenswrapper[4965]: E1125 15:04:53.503374 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:05:09.503366296 +0000 UTC m=+54.470960042 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503508 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:09.503474699 +0000 UTC m=+54.471068645 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.503532 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:09.50352221 +0000 UTC m=+54.471116276 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.504855 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.504883 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.504891 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.504905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.504915 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.606802 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.606904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.606912 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.606926 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.606935 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.613991 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.614020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.614030 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.614042 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.614050 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.625780 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.628467 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.628495 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.628503 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.628517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.628525 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.638851 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.641845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.641884 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.641894 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.641908 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.641917 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.651775 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.654444 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.654472 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.654479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.654491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.654500 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.667559 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.670532 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.670610 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.670621 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.670638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.670649 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.680595 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.680698 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.708359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.708387 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.708396 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.708432 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.708443 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.771202 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.771208 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.771220 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.771317 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.771407 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.771468 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.774226 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.774950 4965 scope.go:117] "RemoveContainer" containerID="0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.775395 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.805076 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.805172 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: E1125 15:04:53.805336 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:55.805206884 +0000 UTC m=+40.772800630 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.811506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.811531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.811540 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.811553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.811563 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.914466 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.914507 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.914516 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.914531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:53 crc kubenswrapper[4965]: I1125 15:04:53.914540 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:53Z","lastTransitionTime":"2025-11-25T15:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.016819 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.017193 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.017208 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.017524 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.017632 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.027546 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.029236 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.029602 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.033615 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/1.log" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.036329 4965 scope.go:117] "RemoveContainer" containerID="9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe" Nov 25 15:04:54 crc kubenswrapper[4965]: E1125 15:04:54.036446 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.042815 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.052309 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.063657 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.073099 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.084553 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.094405 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.109468 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.119922 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.119946 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.119954 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.119978 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.119986 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.128018 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f1d6790abbb2b97c8abe204688b8a15f3c00f2a7b02bb7ab605f739a7af8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:49.942082 6156 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:04:49.942097 6156 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:04:49.942109 6156 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 
15:04:49.942114 6156 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:04:49.942177 6156 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:04:49.942188 6156 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:04:49.942199 6156 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:04:49.942218 6156 factory.go:656] Stopping watch factory\\\\nI1125 15:04:49.942231 6156 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:04:49.942243 6156 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:49.942250 6156 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:04:49.942256 6156 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:49.942262 6156 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:04:49.942269 6156 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:04:49.942281 6156 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping 
reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/va
r/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.136989 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc 
kubenswrapper[4965]: I1125 15:04:54.150590 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.163241 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.173315 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.188071 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.208151 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.221517 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.222300 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.222340 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.222369 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.222385 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.222395 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.236505 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486d
a797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.262462 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.276113 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.289106 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.302515 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.314832 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.324951 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 
15:04:54.325163 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.325238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.325342 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.325418 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.330151 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.341448 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.356700 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.367385 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.380477 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.398207 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.407742 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.420020 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.428073 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.428257 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.428351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.428459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.428585 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.441461 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.454420 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.466719 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.479071 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.489677 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.531748 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.531835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.531851 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.531875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.531894 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.634340 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.634375 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.634387 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.634405 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.634417 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.736903 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.736933 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.736945 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.736962 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.737000 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.838622 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.838649 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.838656 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.838670 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.838680 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.941110 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.941154 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.941165 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.941181 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:54 crc kubenswrapper[4965]: I1125 15:04:54.941194 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:54Z","lastTransitionTime":"2025-11-25T15:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.043150 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.043179 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.043187 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.043200 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.043209 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.145310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.145349 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.145361 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.145376 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.145388 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.247575 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.247610 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.247619 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.247632 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.247642 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.350090 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.350134 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.350144 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.350160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.350170 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.452432 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.452471 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.452483 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.452498 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.452510 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.555698 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.555764 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.555788 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.555817 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.555840 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.658761 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.658818 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.658836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.658861 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.658878 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.760756 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.760795 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.760805 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.760819 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.760830 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.771247 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.771305 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.771387 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:55 crc kubenswrapper[4965]: E1125 15:04:55.771551 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.772065 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:55 crc kubenswrapper[4965]: E1125 15:04:55.772156 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:04:55 crc kubenswrapper[4965]: E1125 15:04:55.772223 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:55 crc kubenswrapper[4965]: E1125 15:04:55.772322 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.823403 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:55 crc kubenswrapper[4965]: E1125 15:04:55.823618 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:55 crc kubenswrapper[4965]: E1125 15:04:55.823718 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:04:59.82368877 +0000 UTC m=+44.791282556 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.863727 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.863808 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.863824 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.863845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.863859 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.966933 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.967305 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.967407 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.967519 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:55 crc kubenswrapper[4965]: I1125 15:04:55.967603 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:55Z","lastTransitionTime":"2025-11-25T15:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.070149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.070186 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.070198 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.070214 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.070225 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.172838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.172880 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.172895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.172914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.172929 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.274874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.274923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.274948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.274992 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.275022 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.378369 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.378427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.378448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.378477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.378500 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.481543 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.482081 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.482166 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.482242 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.482309 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.584930 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.585269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.585366 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.585454 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.585514 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.687457 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.687491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.687502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.687518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.687530 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.789319 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.790880 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc 
kubenswrapper[4965]: I1125 15:04:56.790899 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.790907 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.790920 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.790929 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.805322 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.824672 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.841812 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.863133 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.876009 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc 
kubenswrapper[4965]: I1125 15:04:56.887117 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.893014 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.893054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.893063 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.893079 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.893089 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.900557 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.911431 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.924026 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a
8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.941556 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.953193 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.965833 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.983475 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.995180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.995249 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.995260 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.995274 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:56 crc kubenswrapper[4965]: I1125 15:04:56.995284 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:56Z","lastTransitionTime":"2025-11-25T15:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.000415 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.016572 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.028603 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:04:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.097454 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.097485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.097496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.097511 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.097522 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.200460 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.200503 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.200517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.200536 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.200550 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.308051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.308079 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.308087 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.308101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.308109 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.411476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.411526 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.411543 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.411568 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.411586 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.514395 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.514424 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.514435 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.514451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.514461 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.617431 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.617496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.617520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.617549 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.617573 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.721254 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.721329 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.721354 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.721383 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.721405 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.771498 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:57 crc kubenswrapper[4965]: E1125 15:04:57.771678 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.772129 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.772231 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:57 crc kubenswrapper[4965]: E1125 15:04:57.772366 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.772400 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:57 crc kubenswrapper[4965]: E1125 15:04:57.772508 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:57 crc kubenswrapper[4965]: E1125 15:04:57.772605 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.823884 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.823916 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.823929 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.823944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.823956 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.926524 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.926551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.926561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.926574 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:57 crc kubenswrapper[4965]: I1125 15:04:57.926582 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:57Z","lastTransitionTime":"2025-11-25T15:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.028547 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.028596 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.028605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.028619 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.028628 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.131880 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.131944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.131999 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.132031 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.132056 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.235274 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.235348 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.235371 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.235401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.235424 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.338874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.338921 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.338938 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.338993 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.339012 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.441735 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.441782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.441804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.441831 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.441878 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.545045 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.545081 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.545092 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.545109 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.545123 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.648715 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.648783 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.648806 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.648836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.648854 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.750812 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.750841 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.750852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.750868 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.750879 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.852779 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.852814 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.852822 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.852835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.852844 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.957111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.957158 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.957178 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.957202 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:58 crc kubenswrapper[4965]: I1125 15:04:58.957221 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:58Z","lastTransitionTime":"2025-11-25T15:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.059733 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.059770 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.059780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.059794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.059805 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.161518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.161561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.161573 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.161614 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.161627 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.264834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.264907 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.264931 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.264960 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.265035 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.378704 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.378764 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.378780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.378811 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.378846 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.481821 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.481868 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.481879 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.481893 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.481902 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.585237 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.585330 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.585354 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.585896 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.586199 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.689336 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.689401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.689417 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.689439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.689452 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.770493 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.770572 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:04:59 crc kubenswrapper[4965]: E1125 15:04:59.770624 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.770651 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:04:59 crc kubenswrapper[4965]: E1125 15:04:59.770700 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.770579 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:59 crc kubenswrapper[4965]: E1125 15:04:59.770862 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:04:59 crc kubenswrapper[4965]: E1125 15:04:59.770997 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.792233 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.792259 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.792269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.792284 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.792295 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.876723 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:04:59 crc kubenswrapper[4965]: E1125 15:04:59.876876 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:59 crc kubenswrapper[4965]: E1125 15:04:59.876942 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:07.876923722 +0000 UTC m=+52.844517478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.895306 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.895337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.895345 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.895359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.895367 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.997757 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.997831 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.997849 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.997874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:04:59 crc kubenswrapper[4965]: I1125 15:04:59.997892 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:04:59Z","lastTransitionTime":"2025-11-25T15:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.101284 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.101357 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.101379 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.101410 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.101436 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.204817 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.204879 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.204895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.204921 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.204939 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.307719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.307779 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.307796 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.307821 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.307837 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.410785 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.410840 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.410860 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.410883 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.410900 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.513529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.513587 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.513740 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.513782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.513806 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.616149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.616193 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.616210 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.616231 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.616250 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.718477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.718514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.718529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.718550 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.718569 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.820681 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.820760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.820789 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.820832 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.820850 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.923170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.923202 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.923213 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.923227 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:00 crc kubenswrapper[4965]: I1125 15:05:00.923239 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:00Z","lastTransitionTime":"2025-11-25T15:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.026368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.026396 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.026407 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.026425 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.026436 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.128765 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.128827 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.128845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.128874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.128896 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.231737 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.231776 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.231787 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.231802 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.231813 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.334347 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.334389 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.334400 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.334418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.334430 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.436610 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.436640 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.436649 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.436661 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.436669 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.539861 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.539925 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.539939 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.539958 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.540004 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.642796 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.642835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.642847 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.642864 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.642876 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.745085 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.745135 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.745150 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.745172 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.745187 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.770736 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.770751 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.770772 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.770787 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:01 crc kubenswrapper[4965]: E1125 15:05:01.771440 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:01 crc kubenswrapper[4965]: E1125 15:05:01.771548 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:01 crc kubenswrapper[4965]: E1125 15:05:01.771635 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:01 crc kubenswrapper[4965]: E1125 15:05:01.771714 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.847585 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.847706 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.847729 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.847753 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.847769 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.951054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.951096 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.951111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.951133 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:01 crc kubenswrapper[4965]: I1125 15:05:01.951148 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:01Z","lastTransitionTime":"2025-11-25T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.053647 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.053682 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.053694 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.053710 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.053722 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:02Z","lastTransitionTime":"2025-11-25T15:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.156661 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.156723 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.156747 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.156780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.156802 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:02Z","lastTransitionTime":"2025-11-25T15:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.259321 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.259375 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.259386 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.259405 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:02 crc kubenswrapper[4965]: I1125 15:05:02.259416 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:02Z","lastTransitionTime":"2025-11-25T15:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.718075 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.718145 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.718161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.718197 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.718216 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:03Z","lastTransitionTime":"2025-11-25T15:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.771561 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.771703 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.771575 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:03 crc kubenswrapper[4965]: E1125 15:05:03.771774 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.771831 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:03 crc kubenswrapper[4965]: E1125 15:05:03.772050 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:03 crc kubenswrapper[4965]: E1125 15:05:03.772143 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:03 crc kubenswrapper[4965]: E1125 15:05:03.772273 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.821875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.821923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.821959 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.821994 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.822004 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:03Z","lastTransitionTime":"2025-11-25T15:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.925123 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.925157 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.925166 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.925182 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:03 crc kubenswrapper[4965]: I1125 15:05:03.925193 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:03Z","lastTransitionTime":"2025-11-25T15:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.009749 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.009804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.009821 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.009845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.009874 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: E1125 15:05:04.025412 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.031873 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.031914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.031928 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.031950 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.031995 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: E1125 15:05:04.047450 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.052614 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.052659 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.052668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.052690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.052701 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: E1125 15:05:04.067103 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.072245 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.072291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.072326 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.072351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.072367 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: E1125 15:05:04.087950 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.092082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.092128 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.092143 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.092163 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.092179 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: E1125 15:05:04.106844 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:04 crc kubenswrapper[4965]: E1125 15:05:04.107020 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.109576 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.109615 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.109633 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.109655 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.109673 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.212493 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.212539 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.212556 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.212582 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.212600 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.315475 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.315536 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.315551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.315574 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.315590 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.418763 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.418824 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.418834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.418849 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.418858 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.521778 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.521817 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.521825 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.521838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.521848 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.623882 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.624007 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.624042 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.624072 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.624097 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.726582 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.726651 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.726675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.726705 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.726733 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.830348 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.830409 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.830430 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.830457 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.830478 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.933639 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.933702 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.933714 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.933730 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:04 crc kubenswrapper[4965]: I1125 15:05:04.933743 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:04Z","lastTransitionTime":"2025-11-25T15:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.037117 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.037175 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.037192 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.037213 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.037225 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.139744 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.139794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.139806 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.139823 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.139835 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.242794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.242849 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.242865 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.242885 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.242898 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.344651 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.344692 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.344708 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.344725 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.344737 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.447957 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.448012 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.448025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.448044 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.448058 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.550021 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.550061 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.550081 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.550109 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.550125 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.652358 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.652422 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.652442 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.652465 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.652482 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.754418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.754474 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.754489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.754514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.754536 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.771150 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.771213 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.771163 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.771165 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:05 crc kubenswrapper[4965]: E1125 15:05:05.771452 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:05 crc kubenswrapper[4965]: E1125 15:05:05.771346 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:05 crc kubenswrapper[4965]: E1125 15:05:05.772184 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:05 crc kubenswrapper[4965]: E1125 15:05:05.772325 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.856941 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.857004 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.857017 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.857033 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.857045 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.958730 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.958786 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.958806 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.958836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:05 crc kubenswrapper[4965]: I1125 15:05:05.958858 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:05Z","lastTransitionTime":"2025-11-25T15:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.071807 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.071842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.071852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.071869 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.071880 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.173206 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.173232 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.173240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.173252 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.173261 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.276220 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.276312 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.276332 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.276362 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.276384 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.379412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.379454 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.379468 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.379486 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.379498 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.481640 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.481693 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.481707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.481726 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.481742 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.584064 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.584103 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.584114 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.584128 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.584140 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.686780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.686838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.686853 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.686874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.686891 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.790573 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.790632 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.790654 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.790683 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.790704 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.795933 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.811837 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.827235 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.842717 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.857497 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.871025 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.885166 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.893180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.893226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.893240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.893260 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.893274 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.900723 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.919032 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.931566 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.945044 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.958514 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.977693 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.988866 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.995450 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.995500 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.995513 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:06 crc 
kubenswrapper[4965]: I1125 15:05:06.995534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:06 crc kubenswrapper[4965]: I1125 15:05:06.995547 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:06Z","lastTransitionTime":"2025-11-25T15:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.001111 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 
15:05:07.009890 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.021521 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491c
d10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.097285 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 
15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.097337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.097346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.097360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.097369 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.200128 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.200188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.200206 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.200236 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.200254 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.304301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.304358 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.304375 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.304397 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.304415 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.355368 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.366373 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.384252 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd80505
28b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.402738 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.406867 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.406931 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.406944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.406998 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.407020 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.418587 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.428130 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.442142 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.458253 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.470379 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.482263 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.494232 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.504063 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.510255 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.510294 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.510304 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.510320 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.510332 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.515102 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.526983 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.541158 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85f
fe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.567324 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 
factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.580080 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.592224 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.611416 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.612804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.612830 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.612841 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.612856 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.612868 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.715823 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.715849 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.715859 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.715870 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.715879 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.771484 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.771517 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.771609 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:07 crc kubenswrapper[4965]: E1125 15:05:07.771604 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:07 crc kubenswrapper[4965]: E1125 15:05:07.771727 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.771793 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:07 crc kubenswrapper[4965]: E1125 15:05:07.771927 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:07 crc kubenswrapper[4965]: E1125 15:05:07.772054 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.817944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.818167 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.818180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.818195 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.818206 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.888795 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:07 crc kubenswrapper[4965]: E1125 15:05:07.889026 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:05:07 crc kubenswrapper[4965]: E1125 15:05:07.889136 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:23.889108458 +0000 UTC m=+68.856702234 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.920228 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.920270 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.920287 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.920307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:07 crc kubenswrapper[4965]: I1125 15:05:07.920322 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:07Z","lastTransitionTime":"2025-11-25T15:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.023031 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.023062 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.023070 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.023082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.023091 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.124759 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.124829 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.124854 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.124884 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.124906 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.227446 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.227507 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.227528 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.227551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.227569 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.333853 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.333895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.333905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.333924 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.333936 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.436103 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.436154 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.436170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.436193 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.436210 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.539180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.539251 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.539276 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.539304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.539325 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.641675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.641729 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.641745 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.641769 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.641819 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.744449 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.744516 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.744537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.744565 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.744587 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.847683 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.847736 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.847748 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.847766 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.847778 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.951479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.951568 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.951590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.951620 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:08 crc kubenswrapper[4965]: I1125 15:05:08.951638 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:08Z","lastTransitionTime":"2025-11-25T15:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.054485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.054544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.054562 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.054588 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.054605 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.157949 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.158027 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.158043 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.158063 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.158078 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.260805 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.260900 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.260933 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.261034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.261063 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.365013 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.365080 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.365104 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.365130 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.365146 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.468747 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.468813 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.468831 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.468855 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.468873 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.571542 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.571608 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.571696 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.571732 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.571753 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.603096 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.603184 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.603220 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.603247 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603309 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:05:41.603271233 +0000 UTC m=+86.570865009 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603368 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.603392 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603422 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:41.603407937 +0000 UTC m=+86.571001683 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603469 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603507 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603526 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603319 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603560 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:41.603553692 +0000 UTC m=+86.571147438 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603628 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:41.603571062 +0000 UTC m=+86.571164848 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603729 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603766 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.603788 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:09 crc 
kubenswrapper[4965]: E1125 15:05:09.603861 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:41.60383347 +0000 UTC m=+86.571427256 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.674945 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.675048 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.675071 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.675098 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.675121 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.770880 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.770938 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.771035 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.770894 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.771157 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.771812 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.771883 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:09 crc kubenswrapper[4965]: E1125 15:05:09.771947 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.772480 4965 scope.go:117] "RemoveContainer" containerID="9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.778247 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.778307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.778332 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.778363 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.778387 
4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.881589 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.881652 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.881669 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.881695 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.881722 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.986622 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.987000 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.987009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.987025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:09 crc kubenswrapper[4965]: I1125 15:05:09.987037 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:09Z","lastTransitionTime":"2025-11-25T15:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.089948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.090024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.090035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.090051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.090063 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.129556 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/1.log" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.132810 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.133998 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.149276 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.162186 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.172441 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.185689 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.191858 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.191875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.191882 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.191895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.191903 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.201581 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.214268 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.233449 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.249751 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc 
kubenswrapper[4965]: I1125 15:05:10.269693 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.288608 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.293564 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.293587 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.293596 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.293611 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.293620 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.303077 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.316627 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.329954 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.338463 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.350017 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.359856 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.377635 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.389722 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.396249 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.396283 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.396291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.396305 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.396318 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.498465 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.498504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.498512 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.498527 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.498536 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.601060 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.601093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.601102 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.601115 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.601125 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.607067 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.623829 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.633413 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.647417 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.663099 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.686830 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.700962 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.703591 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.703663 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.703683 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.703714 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.703741 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.720686 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.735593 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.748794 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.762242 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.776573 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.791271 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.806619 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 
15:05:10.806668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.806681 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.806699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.806711 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.809183 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 
factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.819329 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc 
kubenswrapper[4965]: I1125 15:05:10.830097 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.841731 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.850268 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.865547 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.908713 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.908747 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.908758 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 15:05:10.908774 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:10 crc kubenswrapper[4965]: I1125 
15:05:10.908784 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:10Z","lastTransitionTime":"2025-11-25T15:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.011875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.012268 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.012291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.012307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.012318 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.114988 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.115029 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.115040 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.115058 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.115069 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.138661 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/2.log" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.139419 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/1.log" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.142596 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78" exitCode=1 Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.142630 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.142671 4965 scope.go:117] "RemoveContainer" containerID="9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.143353 4965 scope.go:117] "RemoveContainer" containerID="25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78" Nov 25 15:05:11 crc kubenswrapper[4965]: E1125 15:05:11.143649 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.161538 4965 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.175830 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.185048 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.200599 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.217253 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.217300 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.217310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.217323 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 
15:05:11.217332 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.218195 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9794e7f31d2fa924c2cf1ba84a030a24c7d2dc8c3c0531ea04a1e84a8654b8fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"0] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:04:52.106884 6317 factory.go:656] Stopping watch factory\\\\nI1125 15:04:52.106911 6317 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:04:52.106923 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:04:52.107495 6317 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.107866 6317 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.108164 6317 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:04:52.115942 6317 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:04:52.115962 6317 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:04:52.116017 6317 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:04:52.116043 6317 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:04:52.116107 6317 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\
"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.229864 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc 
kubenswrapper[4965]: I1125 15:05:11.240871 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.251757 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.273486 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.287506 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.298028 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.305806 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.320141 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T
15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.321034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.321078 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.321095 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.321119 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.321134 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.335191 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.345626 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.354399 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.364607 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.374119 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.422726 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.422761 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.422769 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.422782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.422791 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.525360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.525398 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.525409 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.525424 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.525435 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.628506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.628565 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.628582 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.628608 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.628628 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.733388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.733466 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.733490 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.733531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.733610 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.770950 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.771092 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.771051 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.771050 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:11 crc kubenswrapper[4965]: E1125 15:05:11.771279 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:11 crc kubenswrapper[4965]: E1125 15:05:11.771375 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:11 crc kubenswrapper[4965]: E1125 15:05:11.771460 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:11 crc kubenswrapper[4965]: E1125 15:05:11.771628 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.837088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.837132 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.837143 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.837159 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.837171 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.940399 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.940461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.940474 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.940497 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:11 crc kubenswrapper[4965]: I1125 15:05:11.940511 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:11Z","lastTransitionTime":"2025-11-25T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.042776 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.042809 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.042817 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.042829 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.042838 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.145466 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.145508 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.145520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.145537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.145548 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.147145 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/2.log" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.151036 4965 scope.go:117] "RemoveContainer" containerID="25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78" Nov 25 15:05:12 crc kubenswrapper[4965]: E1125 15:05:12.151261 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.166924 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d
2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.181510 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.195872 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.210905 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.229063 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.241762 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15
edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.247872 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.247912 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.247923 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.247941 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.247954 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.256017 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.269368 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.279955 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.297904 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.315765 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.330306 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.346935 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.351109 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.351171 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.351188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 
15:05:12.351209 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.351223 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.362326 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.385639 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.399692 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.411746 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.423313 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.453346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.453419 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.453434 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.453453 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.453464 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.555493 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.555524 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.555533 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.555550 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.555559 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.658265 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.658301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.658310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.658323 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.658331 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.760687 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.760759 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.760772 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.760789 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.760800 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.862428 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.862456 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.862464 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.862477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.862486 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.964405 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.964438 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.964449 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.964464 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:12 crc kubenswrapper[4965]: I1125 15:05:12.964474 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:12Z","lastTransitionTime":"2025-11-25T15:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.066845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.066878 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.066904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.066922 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.066947 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.169246 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.169280 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.169289 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.169304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.169314 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.272351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.272435 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.272457 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.272482 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.272499 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.376046 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.376099 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.376118 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.376141 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.376157 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.479443 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.479514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.479534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.479562 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.479580 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.583565 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.583626 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.583657 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.583680 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.583697 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.686560 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.686593 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.686603 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.686620 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.686632 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.771166 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.771253 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.771260 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:13 crc kubenswrapper[4965]: E1125 15:05:13.771428 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.771442 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:13 crc kubenswrapper[4965]: E1125 15:05:13.771527 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:13 crc kubenswrapper[4965]: E1125 15:05:13.771610 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:13 crc kubenswrapper[4965]: E1125 15:05:13.771662 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.789844 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.789907 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.789919 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.789940 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.789952 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.892770 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.892813 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.892823 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.892839 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.892848 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.996292 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.996392 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.996414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.996441 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:13 crc kubenswrapper[4965]: I1125 15:05:13.996460 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:13Z","lastTransitionTime":"2025-11-25T15:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.099296 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.099378 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.099397 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.099421 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.099437 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.202513 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.202585 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.202607 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.202634 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.202656 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.305027 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.305066 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.305074 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.305087 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.305096 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.407887 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.407958 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.408016 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.408046 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.408069 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.505901 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.505956 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.506019 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.506051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.506065 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: E1125 15:05:14.526810 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.532414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.532502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.532553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.532582 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.532648 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: E1125 15:05:14.554553 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.558799 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.558852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.558870 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.558891 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.558906 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: E1125 15:05:14.577893 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.583038 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.583160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.583225 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.583257 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.583320 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: E1125 15:05:14.604850 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.609828 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.609877 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.609893 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.609913 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.609926 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: E1125 15:05:14.621168 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:14 crc kubenswrapper[4965]: E1125 15:05:14.621327 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.622657 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.622699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.622715 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.622736 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.622751 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.724825 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.724881 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.724896 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.724915 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.724930 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.827030 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.827078 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.827090 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.827110 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.827129 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.930289 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.930347 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.930357 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.930380 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:14 crc kubenswrapper[4965]: I1125 15:05:14.930392 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:14Z","lastTransitionTime":"2025-11-25T15:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.035754 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.035819 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.035836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.035861 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.035880 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.139645 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.139732 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.139758 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.139793 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.139817 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.243209 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.243272 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.243293 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.243321 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.243341 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.346641 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.346712 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.346723 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.346742 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.346755 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.450410 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.450483 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.450504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.450531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.450552 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.553403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.553729 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.553918 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.554244 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.554652 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.659194 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.659268 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.659291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.659319 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.659338 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.762319 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.762595 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.762820 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.763074 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.763304 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.770836 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.770894 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.770902 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.770841 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:15 crc kubenswrapper[4965]: E1125 15:05:15.771066 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:15 crc kubenswrapper[4965]: E1125 15:05:15.771177 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:15 crc kubenswrapper[4965]: E1125 15:05:15.771488 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:15 crc kubenswrapper[4965]: E1125 15:05:15.771681 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.866315 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.866590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.866661 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.866729 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.866788 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.969613 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.969669 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.969690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.969713 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:15 crc kubenswrapper[4965]: I1125 15:05:15.969730 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:15Z","lastTransitionTime":"2025-11-25T15:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.073173 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.074155 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.074191 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.074215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.074232 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.177020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.177075 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.177096 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.177121 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.177141 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.280370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.280433 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.280452 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.280476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.280495 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.383288 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.383330 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.383346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.383367 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.383382 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.486705 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.487009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.487145 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.487253 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.487342 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.589040 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.589140 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.589156 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.589171 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.589180 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.691784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.691857 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.691871 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.691886 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.691897 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.813641 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.813940 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.813949 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.813980 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.813991 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.822988 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.852472 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.864949 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.878065 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.891423 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.901926 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.911784 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.915599 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.915618 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.915626 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:16 crc 
kubenswrapper[4965]: I1125 15:05:16.915638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.915647 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:16Z","lastTransitionTime":"2025-11-25T15:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.923734 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 
15:05:16.933282 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.945690 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491c
d10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.957148 4965 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.978732 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:16 crc kubenswrapper[4965]: I1125 15:05:16.992869 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.005446 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.016802 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.018795 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc 
kubenswrapper[4965]: I1125 15:05:17.018837 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.018847 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.018864 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.018877 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.028958 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.040901 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.053164 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.121024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.121048 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.121056 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.121067 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.121076 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.223081 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.223126 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.223138 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.223153 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.223163 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.325491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.325534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.325549 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.325569 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.325583 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.431613 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.431663 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.431679 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.431703 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.431719 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.534609 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.534645 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.534656 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.534671 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.534683 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.636905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.636961 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.637009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.637031 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.637047 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.739730 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.739771 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.739782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.739798 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.739810 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.771457 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.771521 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.771527 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.771574 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:17 crc kubenswrapper[4965]: E1125 15:05:17.772531 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:17 crc kubenswrapper[4965]: E1125 15:05:17.772689 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:17 crc kubenswrapper[4965]: E1125 15:05:17.772856 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:17 crc kubenswrapper[4965]: E1125 15:05:17.772904 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.842698 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.842838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.842862 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.842889 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.842913 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.945885 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.946299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.946472 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.946665 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:17 crc kubenswrapper[4965]: I1125 15:05:17.946882 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:17Z","lastTransitionTime":"2025-11-25T15:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.049840 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.049874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.049900 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.049913 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.049924 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.153592 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.153657 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.153681 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.153712 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.153736 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.256236 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.256279 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.256291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.256306 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.256318 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.363287 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.363333 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.363344 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.363360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.363370 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.465567 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.465630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.465641 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.465657 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.465667 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.568001 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.568031 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.568041 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.568054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.568064 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.670268 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.670320 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.670329 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.670342 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.670350 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.772836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.772898 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.772916 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.772941 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.773083 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.876362 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.876408 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.876423 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.876436 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.876446 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.979335 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.979396 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.979409 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.979428 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:18 crc kubenswrapper[4965]: I1125 15:05:18.979442 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:18Z","lastTransitionTime":"2025-11-25T15:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.082728 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.082767 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.082775 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.082795 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.082804 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.188093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.188160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.188196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.188243 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.188270 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.292041 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.292091 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.292108 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.292167 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.292185 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.395358 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.395420 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.395437 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.395459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.395476 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.498651 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.498724 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.498741 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.498768 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.498787 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.602522 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.602572 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.602583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.602602 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.602615 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.705041 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.705087 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.705102 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.705121 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.705134 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.771464 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.771513 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.771548 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.771612 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:19 crc kubenswrapper[4965]: E1125 15:05:19.771608 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:19 crc kubenswrapper[4965]: E1125 15:05:19.771847 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:19 crc kubenswrapper[4965]: E1125 15:05:19.771877 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:19 crc kubenswrapper[4965]: E1125 15:05:19.772047 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.808235 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.808333 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.808351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.808372 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.808390 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.910784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.910816 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.910823 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.910858 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:19 crc kubenswrapper[4965]: I1125 15:05:19.910872 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:19Z","lastTransitionTime":"2025-11-25T15:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.013486 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.013542 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.013558 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.013583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.013602 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.116467 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.116510 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.116522 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.116541 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.116553 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.222473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.223874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.223924 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.223949 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.223994 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.325931 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.326043 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.326062 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.326083 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.326099 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.427692 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.427732 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.427744 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.427760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.427772 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.530774 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.530837 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.530855 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.530879 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.530897 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.633281 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.633315 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.633326 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.633342 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.633353 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.735889 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.735929 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.735938 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.735952 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.735982 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.838065 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.838091 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.838100 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.838111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.838119 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.940181 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.940206 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.940214 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.940226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:20 crc kubenswrapper[4965]: I1125 15:05:20.940234 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:20Z","lastTransitionTime":"2025-11-25T15:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.042044 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.042070 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.042077 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.042090 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.042100 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.144168 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.144203 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.144213 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.144230 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.144243 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.246413 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.246466 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.246475 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.246487 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.246497 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.348882 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.348916 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.348925 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.348938 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.348947 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.450743 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.450795 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.450807 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.450822 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.450832 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.553506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.553550 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.553561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.553583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.553595 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.656085 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.656117 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.656126 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.656143 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.656155 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.758350 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.758394 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.758404 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.758420 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.758429 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.778595 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.778622 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.778652 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.778614 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:21 crc kubenswrapper[4965]: E1125 15:05:21.778738 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:21 crc kubenswrapper[4965]: E1125 15:05:21.778850 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:21 crc kubenswrapper[4965]: E1125 15:05:21.778942 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:21 crc kubenswrapper[4965]: E1125 15:05:21.779026 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.860403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.860441 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.860450 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.860463 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.860474 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.962241 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.962493 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.962562 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.962632 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:21 crc kubenswrapper[4965]: I1125 15:05:21.962711 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:21Z","lastTransitionTime":"2025-11-25T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.065364 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.065427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.065444 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.065469 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.065486 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.168420 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.168839 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.169022 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.169194 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.169329 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.271360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.271395 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.271403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.271419 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.271431 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.373487 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.373529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.373537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.373551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.373560 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.476784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.476838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.476854 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.476876 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.476892 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.579571 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.579634 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.579658 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.579681 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.579695 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.681633 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.681667 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.681675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.681690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.681700 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.783406 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.783438 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.783448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.783462 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.783472 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.885780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.886168 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.886188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.886204 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.886215 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.989002 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.989043 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.989054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.989071 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:22 crc kubenswrapper[4965]: I1125 15:05:22.989084 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:22Z","lastTransitionTime":"2025-11-25T15:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.091502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.091525 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.091533 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.091545 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.091567 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.193820 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.193852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.193860 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.193872 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.193881 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.296055 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.296110 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.296129 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.296148 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.296161 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.398650 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.398687 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.398699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.398714 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.398723 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.501422 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.501459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.501468 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.501482 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.501491 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.604284 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.604392 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.604423 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.604441 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.604453 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.707403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.707467 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.707486 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.707528 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.707541 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.771197 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.771253 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.771197 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:23 crc kubenswrapper[4965]: E1125 15:05:23.771383 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.771217 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:23 crc kubenswrapper[4965]: E1125 15:05:23.771317 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:23 crc kubenswrapper[4965]: E1125 15:05:23.771489 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:23 crc kubenswrapper[4965]: E1125 15:05:23.771544 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.810234 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.810283 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.810292 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.810304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.810314 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.914190 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.914229 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.914238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.914251 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.914260 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:23Z","lastTransitionTime":"2025-11-25T15:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:23 crc kubenswrapper[4965]: I1125 15:05:23.961638 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:23 crc kubenswrapper[4965]: E1125 15:05:23.961781 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:05:23 crc kubenswrapper[4965]: E1125 15:05:23.961830 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:05:55.961815049 +0000 UTC m=+100.929408795 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.016503 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.016535 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.016544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.016563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.016574 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.119141 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.119175 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.119183 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.119196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.119205 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.221462 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.221489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.221497 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.221510 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.221519 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.323997 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.324029 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.324039 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.324055 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.324068 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.426353 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.426388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.426401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.426419 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.426431 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.529593 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.529625 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.529635 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.529651 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.529662 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.632404 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.632437 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.632445 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.632461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.632470 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.734832 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.734864 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.734873 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.734887 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.734895 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.819920 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.819984 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.819995 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.820012 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.820025 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: E1125 15:05:24.832384 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.835396 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.835439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.835450 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.835466 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.835478 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: E1125 15:05:24.846250 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.849316 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.849347 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.849366 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.849386 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.849395 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: E1125 15:05:24.859294 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.862331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.862367 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.862376 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.862392 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.862403 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: E1125 15:05:24.873348 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.876361 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.876472 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.876553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.876619 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.876693 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: E1125 15:05:24.887314 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:24 crc kubenswrapper[4965]: E1125 15:05:24.887715 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.889166 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.889207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.889219 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.889236 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.889248 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.991918 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.991956 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.991987 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.992003 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:24 crc kubenswrapper[4965]: I1125 15:05:24.992015 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:24Z","lastTransitionTime":"2025-11-25T15:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.094567 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.094613 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.094628 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.094646 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.094660 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.200952 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.201024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.201036 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.201051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.201061 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.304245 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.304295 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.304308 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.304325 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.304336 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.406951 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.407076 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.407100 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.407133 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.407155 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.510652 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.510714 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.510727 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.510744 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.510755 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.613801 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.613836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.613846 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.613860 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.613870 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.715803 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.715835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.715843 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.715861 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.715871 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.770793 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.770852 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.770877 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:25 crc kubenswrapper[4965]: E1125 15:05:25.771004 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:25 crc kubenswrapper[4965]: E1125 15:05:25.771125 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.771475 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:25 crc kubenswrapper[4965]: E1125 15:05:25.771581 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:25 crc kubenswrapper[4965]: E1125 15:05:25.771676 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.772033 4965 scope.go:117] "RemoveContainer" containerID="25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78" Nov 25 15:05:25 crc kubenswrapper[4965]: E1125 15:05:25.772233 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.818137 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.818169 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.818177 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.818190 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.818199 4965 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.920474 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.920515 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.920526 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.920542 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:25 crc kubenswrapper[4965]: I1125 15:05:25.920555 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:25Z","lastTransitionTime":"2025-11-25T15:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.023090 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.023136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.023147 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.023163 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.023177 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.125443 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.125481 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.125489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.125508 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.125521 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.197911 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/0.log" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.197980 4965 generic.go:334] "Generic (PLEG): container finished" podID="7de2930c-eabd-4919-b214-30b0c83141f7" containerID="af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99" exitCode=1 Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.198013 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerDied","Data":"af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.198371 4965 scope.go:117] "RemoveContainer" containerID="af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.216140 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d
2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.227446 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.227477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.227486 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.227500 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.227511 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.228097 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.245171 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.258963 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.269139 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.281372 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.293054 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.310995 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.327597 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.329579 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.329621 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.329633 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.329650 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.329662 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.342088 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.359000 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.369697 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.382874 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.393400 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.402103 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.414037 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.424545 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.431627 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.431661 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.431670 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.431683 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.431693 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.446415 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.533301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.533346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.533359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.533377 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.533391 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.635057 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.635097 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.635109 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.635125 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.635136 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.737166 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.737251 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.737270 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.737296 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.737314 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.790913 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.821795 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.838357 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.839475 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.839511 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.839521 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.839537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.839548 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.855823 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.871632 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.881587 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.894948 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.907659 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.918511 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.931390 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.942378 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.942414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.942424 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.942439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.942449 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:26Z","lastTransitionTime":"2025-11-25T15:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.943364 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.973958 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:26 crc kubenswrapper[4965]: I1125 15:05:26.993225 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d
2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.008367 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.022617 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.034879 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.044220 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.044274 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.044282 4965 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.044298 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.044310 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.047338 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.058374 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.146850 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.146932 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.146957 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.147034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.147059 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.202494 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/0.log" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.202779 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerStarted","Data":"837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.218495 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.236205 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.247560 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.248912 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.248952 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.248984 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.249003 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.249015 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.262633 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.288719 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.297568 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.308157 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.318640 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.335002 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.345800 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.351030 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.351065 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.351076 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.351091 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.351102 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.358352 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.368546 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.388299 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T
15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.404096 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.418152 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.433864 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.446503 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.453652 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.453690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.453700 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.453715 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.453724 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.455897 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.555871 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc 
kubenswrapper[4965]: I1125 15:05:27.555914 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.555929 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.555948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.555985 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.660018 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.660059 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.660071 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.660088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.660101 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.762519 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.762553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.762564 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.762577 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.762589 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.771112 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.771116 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.771120 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.771180 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:27 crc kubenswrapper[4965]: E1125 15:05:27.771587 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:27 crc kubenswrapper[4965]: E1125 15:05:27.771677 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:27 crc kubenswrapper[4965]: E1125 15:05:27.771723 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:27 crc kubenswrapper[4965]: E1125 15:05:27.771978 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.782887 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.864528 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.864581 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.864593 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.864610 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.864621 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.966891 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.966923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.966933 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.966945 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:27 crc kubenswrapper[4965]: I1125 15:05:27.966955 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:27Z","lastTransitionTime":"2025-11-25T15:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.069362 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.069433 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.069453 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.069481 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.069498 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.172637 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.172676 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.172690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.172716 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.172727 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.274904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.275049 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.275077 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.275101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.275119 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.377035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.377071 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.377079 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.377093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.377103 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.479319 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.479356 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.479368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.479383 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.479394 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.581101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.581142 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.581156 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.581170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.581183 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.683198 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.683240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.683251 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.683269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.683281 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.953591 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.953634 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.953644 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.953662 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:28 crc kubenswrapper[4965]: I1125 15:05:28.953673 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:28Z","lastTransitionTime":"2025-11-25T15:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.055365 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.055404 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.055414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.055428 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.055438 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.157640 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.157674 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.157685 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.157701 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.157712 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.260064 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.260098 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.260137 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.260169 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.260178 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.362301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.362338 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.362348 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.362361 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.362371 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.467733 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.467796 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.467809 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.467829 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.467839 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.570630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.570688 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.570699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.570712 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.570722 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.673091 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.673127 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.673136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.673150 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.673160 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.771376 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.771440 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.771387 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:29 crc kubenswrapper[4965]: E1125 15:05:29.771549 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.771610 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:29 crc kubenswrapper[4965]: E1125 15:05:29.771707 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:29 crc kubenswrapper[4965]: E1125 15:05:29.771781 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:29 crc kubenswrapper[4965]: E1125 15:05:29.771846 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.775750 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.775777 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.775788 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.775840 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.775857 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.878922 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.878955 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.878980 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.878994 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.879004 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.981424 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.981462 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.981470 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.981485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:29 crc kubenswrapper[4965]: I1125 15:05:29.981496 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:29Z","lastTransitionTime":"2025-11-25T15:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.084538 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.084598 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.084614 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.084636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.084652 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.187807 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.187869 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.187887 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.187910 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.187927 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.290336 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.290373 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.290383 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.290397 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.290408 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.393052 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.393088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.393100 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.393117 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.393128 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.495901 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.495936 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.495944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.495957 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.495991 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.598743 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.598791 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.598804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.598822 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.598834 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.701438 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.701477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.701485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.701500 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.701512 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.803618 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.803657 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.803668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.803684 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.803696 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.906169 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.906218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.906242 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.906264 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:30 crc kubenswrapper[4965]: I1125 15:05:30.906278 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:30Z","lastTransitionTime":"2025-11-25T15:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.009900 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.009997 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.010023 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.010052 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.010075 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.114144 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.114359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.114381 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.114416 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.114436 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.216711 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.216785 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.216807 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.216838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.216859 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.319771 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.319811 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.319820 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.319834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.319843 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.423125 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.423169 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.423179 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.423192 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.423200 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.526036 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.526084 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.526099 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.526119 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.526132 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.628767 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.628854 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.628886 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.628923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.628945 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.731738 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.731777 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.731788 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.731802 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.731812 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.771266 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.771339 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.771479 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:31 crc kubenswrapper[4965]: E1125 15:05:31.771468 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.771548 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:31 crc kubenswrapper[4965]: E1125 15:05:31.771709 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:31 crc kubenswrapper[4965]: E1125 15:05:31.771767 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:31 crc kubenswrapper[4965]: E1125 15:05:31.771821 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.834850 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.834890 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.834902 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.834920 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.834931 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.938081 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.938204 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.938230 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.938261 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:31 crc kubenswrapper[4965]: I1125 15:05:31.938284 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:31Z","lastTransitionTime":"2025-11-25T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.041591 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.041636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.041647 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.041662 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.041671 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.144530 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.144588 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.144606 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.144636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.144653 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.247581 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.247707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.247738 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.247766 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.247786 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.350601 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.350647 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.350659 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.350676 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.350689 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.452638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.452675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.452684 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.452699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.452708 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.555232 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.555291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.555304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.555318 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.555329 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.661659 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.661723 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.661747 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.661777 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.661799 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.764577 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.764621 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.764630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.764645 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.764655 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.867297 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.867352 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.867364 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.867381 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.867394 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.970373 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.970412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.970421 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.970436 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:32 crc kubenswrapper[4965]: I1125 15:05:32.970446 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:32Z","lastTransitionTime":"2025-11-25T15:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.073753 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.073834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.073857 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.073884 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.073906 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.176630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.176673 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.176688 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.176707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.176720 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.279361 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.279469 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.279494 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.279525 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.279546 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.384785 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.384845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.384863 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.384891 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.384911 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.488021 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.488079 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.488092 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.488112 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.488127 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.590654 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.590689 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.590700 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.590715 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.590725 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.692632 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.692672 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.692681 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.692695 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.692705 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.770848 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.770844 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.770895 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.771085 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:33 crc kubenswrapper[4965]: E1125 15:05:33.771258 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:33 crc kubenswrapper[4965]: E1125 15:05:33.771359 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:33 crc kubenswrapper[4965]: E1125 15:05:33.771425 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:33 crc kubenswrapper[4965]: E1125 15:05:33.771578 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.795285 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.795342 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.795359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.795383 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.795400 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.897248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.897313 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.897332 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.897360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:33 crc kubenswrapper[4965]: I1125 15:05:33.897377 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:33Z","lastTransitionTime":"2025-11-25T15:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.000749 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.000824 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.000845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.000873 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.000892 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.103499 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.103565 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.103576 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.103595 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.103608 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.206864 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.206905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.206916 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.206933 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.206945 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.310646 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.310765 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.310828 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.310860 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.310917 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.415343 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.415427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.415445 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.415500 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.415517 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.518562 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.518672 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.518696 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.518724 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.518740 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.622422 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.622515 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.622531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.622553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.622570 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.725446 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.725488 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.725502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.725551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.725562 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.828167 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.828226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.828248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.828275 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.828296 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.931944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.932042 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.932061 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.932087 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:34 crc kubenswrapper[4965]: I1125 15:05:34.932111 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:34Z","lastTransitionTime":"2025-11-25T15:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.035409 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.035476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.035525 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.035553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.035570 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.138139 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.138176 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.138184 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.138197 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.138207 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.228545 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.228580 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.228589 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.228621 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.228632 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.245731 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.250477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.250517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.250531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.250549 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.250561 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.263076 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.267483 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.267527 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.267543 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.267563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.267579 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.281931 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.285842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.285921 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.285931 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.285945 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.285954 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.297143 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.301053 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.301088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.301101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.301120 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.301135 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.312712 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.312879 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.315240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.315440 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.315476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.315568 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.315625 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.419474 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.419535 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.419551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.419575 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.419591 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.522309 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.522370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.522390 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.522415 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.522432 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.625693 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.625754 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.625770 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.625796 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.625815 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.728623 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.728671 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.728683 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.728700 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.728712 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.771314 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.771371 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.771323 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.771508 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.771605 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.771819 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.771840 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:35 crc kubenswrapper[4965]: E1125 15:05:35.772082 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.832401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.832451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.832463 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.832484 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.832498 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.935400 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.935435 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.935446 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.935461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:35 crc kubenswrapper[4965]: I1125 15:05:35.935472 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:35Z","lastTransitionTime":"2025-11-25T15:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.038417 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.038454 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.038464 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.038479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.038493 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.140741 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.140809 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.140831 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.140858 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.140880 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.244448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.244510 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.244524 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.244552 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.244567 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.348024 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.348078 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.348093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.348118 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.348135 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.450370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.450410 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.450422 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.450446 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.450470 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.553769 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.553833 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.553850 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.553872 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.553888 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.656576 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.656615 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.656626 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.656641 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.656650 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.760027 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.760071 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.760082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.760101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.760113 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.791377 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.816125 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.830188 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.853052 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.862630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.862659 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.862669 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.862685 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 
15:05:36.862697 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.868927 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.877104 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.886571 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.897176 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.913947 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.924724 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.935205 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.943732 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.956183 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T
15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.965398 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.965450 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.965466 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.965492 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.965510 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:36Z","lastTransitionTime":"2025-11-25T15:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.968669 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dacb07a-db92-4294-a7e4-1012bbe6c9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.980911 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:36 crc kubenswrapper[4965]: I1125 15:05:36.992271 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.001502 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.014740 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.025434 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.068668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.068705 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.068717 4965 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.068738 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.068753 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.171599 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.171676 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.171687 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.171704 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.171715 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.275100 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.275161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.275182 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.275211 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.275242 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.377916 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.378026 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.378047 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.378073 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.378090 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.480206 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.480302 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.480328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.480361 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.480385 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.594250 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.594311 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.594328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.594352 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.594372 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.697589 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.697653 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.697671 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.697698 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.697713 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.771190 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.771264 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.771403 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:37 crc kubenswrapper[4965]: E1125 15:05:37.771391 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.771456 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:37 crc kubenswrapper[4965]: E1125 15:05:37.771591 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.772671 4965 scope.go:117] "RemoveContainer" containerID="25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78" Nov 25 15:05:37 crc kubenswrapper[4965]: E1125 15:05:37.773210 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:37 crc kubenswrapper[4965]: E1125 15:05:37.773427 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.800731 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.800788 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.800807 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.800831 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.800848 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.903397 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.903451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.903469 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.903496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:37 crc kubenswrapper[4965]: I1125 15:05:37.903514 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:37Z","lastTransitionTime":"2025-11-25T15:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.007011 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.007512 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.007533 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.007560 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.007577 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.110121 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.110174 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.110191 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.110214 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.110230 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.212440 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.212491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.212506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.212529 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.212544 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.241240 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/2.log" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.244722 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.245689 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.264911 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.278947 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.291735 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.305024 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.315161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.315399 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.315412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.315433 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.315448 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.318906 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.328963 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dacb07a-db92-4294-a7e4-1012bbe6c9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.343434 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.352831 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.365761 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.384933 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.395180 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc 
kubenswrapper[4965]: I1125 15:05:38.407070 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.418957 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.419478 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.419507 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.419517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.419531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.419542 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.435735 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.448868 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.460874 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.469562 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.483278 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.497652 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d
2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.521473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.521503 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.521514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.521531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.521544 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.623605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.623650 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.623659 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.623673 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.623681 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.725782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.725827 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.725842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.725861 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.725872 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.828420 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.828455 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.828465 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.828480 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.828491 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.932018 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.932050 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.932058 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.932074 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:38 crc kubenswrapper[4965]: I1125 15:05:38.932083 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:38Z","lastTransitionTime":"2025-11-25T15:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.034384 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.034418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.034428 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.034443 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.034456 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.136839 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.136868 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.136875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.136887 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.136897 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.242186 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.242246 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.242265 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.242290 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.242306 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.251320 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/3.log" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.252294 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/2.log" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.256735 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1" exitCode=1 Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.256782 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.256845 4965 scope.go:117] "RemoveContainer" containerID="25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.258503 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1" Nov 25 15:05:39 crc kubenswrapper[4965]: E1125 15:05:39.259252 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.280493 4965 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.300396 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.329846 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.341709 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.345373 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.345412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.345428 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.345445 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.345457 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.354614 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.366034 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.380795 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T
15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.392113 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dacb07a-db92-4294-a7e4-1012bbe6c9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.408661 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.422335 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.433586 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.448686 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 
15:05:39.448871 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.448994 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.449097 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.449178 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.457196 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.474101 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.488682 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.501250 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.514550 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.535261 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.555240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.555573 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.555723 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.555863 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 
15:05:39.556035 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.577208 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25802efffddf38ca92a05b878fe20b0eb4c1daa05635066cf40a8965241d0e78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"message\\\":\\\"0\\\\nI1125 15:05:10.592499 6555 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 15:05:10.592705 6555 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 
15:05:10.593333 6555 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593479 6555 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593723 6555 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.593831 6555 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:05:10.594194 6555 factory.go:656] Stopping watch factory\\\\nI1125 15:05:10.645914 6555 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1125 15:05:10.645939 6555 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1125 15:05:10.646007 6555 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:05:10.646044 6555 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1125 15:05:10.646137 6555 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:38Z\\\",\\\"message\\\":\\\"ty-vrzqb after 0 failed attempt(s)\\\\nI1125 15:05:38.979282 6906 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1125 15:05:38.978750 6906 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wpjkp in node crc\\\\nI1125 15:05:38.979333 6906 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 15:05:38.979361 6906 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-machine-webhook per-node LB for network=default: []services.LB{}\\\\nI1125 15:05:38.979370 6906 
serv\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a43
4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.600734 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.658506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.658556 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.658572 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.658594 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.658612 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.761160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.761207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.761219 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.761237 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.761249 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.771410 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.771465 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.771476 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:39 crc kubenswrapper[4965]: E1125 15:05:39.771959 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:39 crc kubenswrapper[4965]: E1125 15:05:39.772282 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:39 crc kubenswrapper[4965]: E1125 15:05:39.772128 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.772503 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:39 crc kubenswrapper[4965]: E1125 15:05:39.772734 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.863666 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.864519 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.864666 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.864836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.865042 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.971308 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.971386 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.971408 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.971439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:39 crc kubenswrapper[4965]: I1125 15:05:39.971461 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:39Z","lastTransitionTime":"2025-11-25T15:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.074272 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.074623 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.074766 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.074912 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.075088 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.178506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.178565 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.178580 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.178601 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.178615 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.260492 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/3.log" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.264445 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1" Nov 25 15:05:40 crc kubenswrapper[4965]: E1125 15:05:40.265044 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.279237 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.280790 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.281100 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.281477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.281888 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.282204 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.312406 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.323783 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.334485 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.342699 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.354412 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.365327 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d
2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.375369 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.384708 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.384737 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.384745 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 
15:05:40.384758 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.384768 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.385464 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.396038 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac2
2c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.410402 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.421714 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.430585 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dacb07a-db92-4294-a7e4-1012bbe6c9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.442712 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.452887 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.465614 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8
249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.482164 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:38Z\\\",\\\"message\\\":\\\"ty-vrzqb after 0 failed attempt(s)\\\\nI1125 15:05:38.979282 6906 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1125 15:05:38.978750 6906 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wpjkp in node crc\\\\nI1125 15:05:38.979333 6906 
services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 15:05:38.979361 6906 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-machine-webhook per-node LB for network=default: []services.LB{}\\\\nI1125 15:05:38.979370 6906 serv\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.486583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.486617 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.486628 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.486642 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.486653 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.491260 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc 
kubenswrapper[4965]: I1125 15:05:40.502294 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.589239 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.589280 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.589289 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.589303 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.589315 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.692918 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.693025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.693039 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.693056 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.693069 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.795850 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.795936 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.795954 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.796015 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.796036 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.899790 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.899841 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.899857 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.899880 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:40 crc kubenswrapper[4965]: I1125 15:05:40.899898 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:40Z","lastTransitionTime":"2025-11-25T15:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.003248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.003291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.003303 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.003320 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.003333 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.106721 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.106833 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.106851 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.106915 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.106945 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.209390 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.209444 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.209461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.209482 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.209500 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.312375 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.312438 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.312449 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.312468 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.312480 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.420428 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.420461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.420469 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.420481 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.420490 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.522748 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.522799 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.522815 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.522838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.522854 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.625011 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.625056 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.625066 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.625082 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.625093 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.702177 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.702294 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.702351 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.702402 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.702437 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702478 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.702443892 +0000 UTC m=+150.670037678 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702565 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702561 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702607 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702615 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.702603465 +0000 UTC m=+150.670197231 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702623 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702561 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702657 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702677 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702697 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.702668457 +0000 UTC m=+150.670262323 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702722 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.702709338 +0000 UTC m=+150.670303244 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702761 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.702808 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:06:45.702794369 +0000 UTC m=+150.670388205 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.728212 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.728266 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.728283 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.728306 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.728354 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.771067 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.771096 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.771123 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.771109 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.771219 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.771427 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.771531 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:41 crc kubenswrapper[4965]: E1125 15:05:41.771594 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.831872 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.831940 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.832005 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.832044 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.832071 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.935780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.935845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.935872 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.935904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:41 crc kubenswrapper[4965]: I1125 15:05:41.935924 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:41Z","lastTransitionTime":"2025-11-25T15:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.039090 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.039168 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.039190 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.039221 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.039243 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.141727 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.141776 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.141789 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.141809 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.141823 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.244073 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.244143 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.244166 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.244192 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.244210 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.346942 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.347049 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.347072 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.347099 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.347140 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.449962 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.450042 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.450061 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.450088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.450109 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.553489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.553543 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.553561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.553583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.553602 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.655930 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.655989 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.656002 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.656017 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.656031 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.758439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.758479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.758489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.758504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.758515 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.860688 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.860748 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.860770 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.860799 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.860820 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.963493 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.963558 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.963577 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.963599 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:42 crc kubenswrapper[4965]: I1125 15:05:42.963618 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:42Z","lastTransitionTime":"2025-11-25T15:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.066903 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.066959 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.067006 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.067036 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.067058 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.170953 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.171106 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.171129 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.171157 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.171181 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.278736 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.278789 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.278804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.278826 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.278841 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.381493 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.381553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.381571 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.381596 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.381613 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.485094 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.485157 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.485177 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.485207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.485226 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.588727 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.588776 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.588794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.588817 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.588835 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.691411 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.691461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.691474 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.691494 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.691510 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.770981 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.771038 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:43 crc kubenswrapper[4965]: E1125 15:05:43.771110 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.771052 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.771052 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:43 crc kubenswrapper[4965]: E1125 15:05:43.771208 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:43 crc kubenswrapper[4965]: E1125 15:05:43.771316 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:43 crc kubenswrapper[4965]: E1125 15:05:43.771476 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.794649 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.794693 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.794704 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.794722 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.794732 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.898250 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.898322 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.898335 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.898355 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:43 crc kubenswrapper[4965]: I1125 15:05:43.898389 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:43Z","lastTransitionTime":"2025-11-25T15:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.001156 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.001443 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.001453 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.001468 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.001478 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.103618 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.103666 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.103675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.103689 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.103698 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.206590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.206626 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.206638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.206654 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.206665 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.309737 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.309830 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.309849 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.309907 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.309925 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.412156 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.412204 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.412222 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.412250 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.412268 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.515418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.515459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.515470 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.515487 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.515500 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.618757 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.618817 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.618835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.618862 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.618881 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.722063 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.722126 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.722144 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.722175 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.722192 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.824743 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.824775 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.824786 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.824804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.824814 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.926918 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.926952 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.927009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.927025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:44 crc kubenswrapper[4965]: I1125 15:05:44.927036 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:44Z","lastTransitionTime":"2025-11-25T15:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.029149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.029184 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.029198 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.029214 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.029224 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.132205 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.132246 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.132260 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.132282 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.132298 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.235144 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.235294 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.235315 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.235341 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.235359 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.338315 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.338625 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.338645 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.338670 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.338687 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.383297 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.383335 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.383345 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.383357 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.383366 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.403653 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.410557 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.410610 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.410625 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.410644 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.410659 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.429564 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.434829 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.434878 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.434894 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.434913 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.434929 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.452507 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.456348 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.456380 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.456388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.456402 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.456414 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.471107 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.475523 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.475557 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.475570 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.475586 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.475598 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.488517 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.488639 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.496437 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.496488 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.496500 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.496519 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.496544 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.598590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.598632 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.598643 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.598660 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.598671 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.701331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.701513 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.701545 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.701573 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.701594 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.771207 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.771224 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.771287 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.771319 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.771489 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.771611 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.771751 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:45 crc kubenswrapper[4965]: E1125 15:05:45.771907 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.805098 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.805173 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.805199 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.805231 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.805254 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.907605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.907642 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.907653 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.907670 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:45 crc kubenswrapper[4965]: I1125 15:05:45.907681 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:45Z","lastTransitionTime":"2025-11-25T15:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.010853 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.010905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.010924 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.010947 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.010996 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.113919 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.113956 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.113987 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.114005 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.114017 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.216791 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.216840 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.216856 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.216880 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.216900 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.320243 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.320318 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.320342 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.320371 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.320394 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.424252 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.424303 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.424323 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.424346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.424364 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.528200 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.528617 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.528820 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.529056 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.529723 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.633374 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.633633 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.633834 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.633948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.634084 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.737620 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.737673 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.737688 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.737707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.737721 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.794137 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.811985 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.824352 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.841853 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 
15:05:46.841942 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.842000 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.842034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.842056 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.843155 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.855445 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.870380 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dacb07a-db92-4294-a7e4-1012bbe6c9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.888714 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.903505 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.927177 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85f
fe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.946181 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.946270 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.946290 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.946337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.946351 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:46Z","lastTransitionTime":"2025-11-25T15:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.960781 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:38Z\\\",\\\"message\\\":\\\"ty-vrzqb after 0 failed attempt(s)\\\\nI1125 15:05:38.979282 6906 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1125 15:05:38.978750 6906 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wpjkp in node crc\\\\nI1125 15:05:38.979333 6906 
services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 15:05:38.979361 6906 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-machine-webhook per-node LB for network=default: []services.LB{}\\\\nI1125 15:05:38.979370 6906 serv\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:46 crc kubenswrapper[4965]: I1125 15:05:46.978467 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.001199 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.017009 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.040528 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.048593 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.048624 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.048635 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.048650 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.048661 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.056072 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.071030 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.084769 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.112638 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.134427 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:47Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.151473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.151508 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.151519 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.151537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.151548 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.254784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.255150 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.255238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.255331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.255404 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.358599 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.359042 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.359225 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.359390 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.359522 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.463485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.463544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.463563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.463591 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.463608 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.566203 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.566304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.566317 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.566342 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.566354 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.668874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.669136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.669213 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.669284 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.669356 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.770728 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:47 crc kubenswrapper[4965]: E1125 15:05:47.770899 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.770765 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:47 crc kubenswrapper[4965]: E1125 15:05:47.770957 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.770807 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:47 crc kubenswrapper[4965]: E1125 15:05:47.771029 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.770723 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:47 crc kubenswrapper[4965]: E1125 15:05:47.771093 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.771897 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.771948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.772000 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.772028 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.772049 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.874016 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.874065 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.874077 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.874096 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.874108 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.976392 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.976429 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.976438 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.976451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:47 crc kubenswrapper[4965]: I1125 15:05:47.976466 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:47Z","lastTransitionTime":"2025-11-25T15:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.078833 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.078896 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.078927 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.078945 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.078956 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.181518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.181552 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.181563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.181579 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.181591 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.284292 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.284341 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.284383 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.284400 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.284412 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.388221 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.388328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.388350 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.388827 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.389412 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.492341 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.492389 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.492406 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.492431 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.492448 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.595291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.595344 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.595369 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.595395 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.595411 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.716535 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.716572 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.716605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.716618 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.716628 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.819268 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.819300 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.819308 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.819322 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.819330 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.921998 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.922046 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.922061 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.922083 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:48 crc kubenswrapper[4965]: I1125 15:05:48.922161 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:48Z","lastTransitionTime":"2025-11-25T15:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.025196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.025299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.025325 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.025352 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.025372 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.128636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.128679 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.128690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.128707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.128719 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.238886 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.240131 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.240338 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.240577 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.240794 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.344051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.344476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.344645 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.344851 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.345023 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.448379 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.448461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.448479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.448502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.448519 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.551455 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.551509 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.551526 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.551551 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.551570 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.654597 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.654667 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.654690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.654722 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.654815 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.758426 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.758745 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.758827 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.758925 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.759055 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.770829 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.770996 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.770877 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.770877 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:49 crc kubenswrapper[4965]: E1125 15:05:49.771269 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:49 crc kubenswrapper[4965]: E1125 15:05:49.771553 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:49 crc kubenswrapper[4965]: E1125 15:05:49.771718 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:49 crc kubenswrapper[4965]: E1125 15:05:49.771861 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.862737 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.862795 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.862813 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.862840 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.862862 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.965819 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.965866 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.965878 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.965897 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:49 crc kubenswrapper[4965]: I1125 15:05:49.965913 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:49Z","lastTransitionTime":"2025-11-25T15:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.069093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.069185 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.069199 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.069218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.069230 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.172026 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.172315 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.172328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.172344 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.172353 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.275889 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.275947 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.275996 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.276021 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.276038 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.379638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.379702 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.379719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.379744 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.379764 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.484677 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.484724 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.484736 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.484760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.484785 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.588721 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.588771 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.588784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.588802 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.588816 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.692099 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.692151 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.692163 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.692180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.692192 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.794943 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.795032 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.795044 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.795060 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.795070 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.900120 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.900274 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.900289 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.900368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:50 crc kubenswrapper[4965]: I1125 15:05:50.900383 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:50Z","lastTransitionTime":"2025-11-25T15:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.004203 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.004249 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.004266 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.004288 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.004305 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.107380 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.107423 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.107434 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.107451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.107461 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.209923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.209951 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.209960 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.209992 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.210002 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.312241 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.312329 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.312351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.312383 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.312473 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.415414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.415459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.415476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.415498 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.415515 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.518414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.518458 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.518470 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.518484 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.518495 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.620862 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.620895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.620905 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.620920 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.620930 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.723237 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.723271 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.723281 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.723296 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.723308 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.771353 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:51 crc kubenswrapper[4965]: E1125 15:05:51.771484 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.771625 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:51 crc kubenswrapper[4965]: E1125 15:05:51.771666 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.771753 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:51 crc kubenswrapper[4965]: E1125 15:05:51.771791 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.771872 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:51 crc kubenswrapper[4965]: E1125 15:05:51.771910 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.776334 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1" Nov 25 15:05:51 crc kubenswrapper[4965]: E1125 15:05:51.776524 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.825469 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.825508 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.825517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.825531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.825540 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.928472 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.928812 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.928824 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.928840 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:51 crc kubenswrapper[4965]: I1125 15:05:51.928852 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:51Z","lastTransitionTime":"2025-11-25T15:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.033631 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.033698 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.033719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.033782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.033801 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.136088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.136116 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.136124 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.136137 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.136146 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.238988 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.239027 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.239038 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.239054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.239064 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.341722 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.341785 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.341803 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.341828 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.341846 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.444335 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.444399 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.444414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.444430 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.444441 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.546902 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.546948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.546995 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.547018 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.547031 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.650135 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.650212 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.650309 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.650386 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.650407 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.753630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.753716 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.753741 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.753794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.753818 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.857147 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.857196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.857207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.857224 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.857233 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.959341 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.959407 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.959420 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.959437 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:52 crc kubenswrapper[4965]: I1125 15:05:52.959449 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:52Z","lastTransitionTime":"2025-11-25T15:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.062321 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.062368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.062379 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.062398 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.062410 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.164378 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.164417 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.164427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.164441 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.164451 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.266995 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.267027 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.267036 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.267050 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.267059 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.369343 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.369401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.369412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.369426 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.369437 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.472573 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.472621 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.472637 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.472656 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.472670 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.575855 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.575909 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.575929 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.575954 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.575998 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.678636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.678695 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.678710 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.678731 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.678747 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.771012 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.771020 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.771033 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.771106 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:53 crc kubenswrapper[4965]: E1125 15:05:53.771244 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:53 crc kubenswrapper[4965]: E1125 15:05:53.771327 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:53 crc kubenswrapper[4965]: E1125 15:05:53.771412 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:53 crc kubenswrapper[4965]: E1125 15:05:53.771689 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.780989 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.781034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.781051 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.781070 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.781087 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.884080 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.884159 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.884179 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.884227 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.884244 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.987550 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.987620 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.987636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.987661 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:53 crc kubenswrapper[4965]: I1125 15:05:53.987679 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:53Z","lastTransitionTime":"2025-11-25T15:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.089729 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.089764 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.089772 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.089786 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.089796 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.192014 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.192077 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.192093 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.192116 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.192133 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.294406 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.294457 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.294469 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.294488 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.294500 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.396495 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.396535 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.396544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.396557 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.396565 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.499583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.499633 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.499644 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.499662 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.499673 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.602215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.602286 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.602308 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.602334 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.602352 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.704930 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.704996 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.705009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.705025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.705037 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.807427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.807475 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.807492 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.807514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.807531 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.910687 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.910744 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.910760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.910782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:54 crc kubenswrapper[4965]: I1125 15:05:54.910799 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:54Z","lastTransitionTime":"2025-11-25T15:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.014462 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.014531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.014553 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.014581 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.014602 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.117756 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.117863 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.117882 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.117904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.117921 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.220275 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.220309 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.220320 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.220334 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.220345 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.324054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.324118 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.324136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.324161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.324180 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.427405 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.427482 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.427514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.427604 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.427633 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.530101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.530139 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.530150 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.530165 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.530176 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.542586 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.542638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.542649 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.542661 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.542670 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.560420 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:55Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.564334 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.564368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.564380 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.564398 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.564412 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.578370 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:55Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.583776 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.583866 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.583891 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.584401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.584677 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.601361 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:55Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.606271 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.606332 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.606347 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.606365 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.606382 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.623730 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:55Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.630476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.630537 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.630548 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.630566 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.630580 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.644541 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10999e79-e7eb-46c5-b6ca-2fdc8ea3b7bf\\\",\\\"systemUUID\\\":\\\"69eb65a6-67c0-4926-88da-f3ca03c4aea4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:55Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.644664 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.646165 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.646207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.646218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.646234 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.646245 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.749034 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.749104 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.749117 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.749135 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.749170 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.771494 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.771537 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.771554 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.771519 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.771628 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.771742 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.771800 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:55 crc kubenswrapper[4965]: E1125 15:05:55.771872 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.852111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.852145 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.852162 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.852179 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.852190 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.954918 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.954960 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.955011 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.955028 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:55 crc kubenswrapper[4965]: I1125 15:05:55.955040 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:55Z","lastTransitionTime":"2025-11-25T15:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.057988 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.058054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.058074 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.058127 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.058147 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.060489 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:56 crc kubenswrapper[4965]: E1125 15:05:56.060627 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:05:56 crc kubenswrapper[4965]: E1125 15:05:56.060717 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs podName:6ed72551-610b-4f03-8a57-319ef27e27e0 nodeName:}" failed. No retries permitted until 2025-11-25 15:07:00.060692326 +0000 UTC m=+165.028286112 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs") pod "network-metrics-daemon-j87z5" (UID: "6ed72551-610b-4f03-8a57-319ef27e27e0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.161259 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.161310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.161327 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.161351 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.161368 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.263510 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.263578 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.263597 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.263624 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.263644 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.367228 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.367729 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.368013 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.368223 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.368446 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.471397 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.471451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.471494 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.471513 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.471526 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.574269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.574331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.574348 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.574371 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.574389 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.678149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.678218 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.678240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.678269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.678290 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.783334 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.783400 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.783423 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.783449 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.783471 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.793886 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a97f603b-f66c-42c7-9916-56c991135ede\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"file observer\\\\nW1125 15:04:37.429615 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1125 15:04:37.429746 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:04:37.430386 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-918909060/tls.crt::/tmp/serving-cert-918909060/tls.key\\\\\\\"\\\\nI1125 15:04:37.586665 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 15:04:37.589347 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 15:04:37.589367 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 15:04:37.589388 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 15:04:37.589393 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 15:04:37.595626 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1125 15:04:37.595651 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1125 15:04:37.595673 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595678 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 15:04:37.595682 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 15:04:37.595685 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 15:04:37.595687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 15:04:37.595691 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 15:04:37.596885 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.815559 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8jdpp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de2930c-eabd-4919-b214-30b0c83141f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:04:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad\\\\n2025-11-25T15:04:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_715195be-0b4f-4412-9174-a7a418fd02ad to /host/opt/cni/bin/\\\\n2025-11-25T15:04:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:04:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:05:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7p2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8jdpp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.833312 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a73fd66-1e46-4473-8508-a8cf24d51a04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://712e28be2eed90b788d67f7012072a099784a13091afbfca86901645aee5aabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58033e15edf8fb3dcd8950bfefd249fdc253
c4e291d14fe986cfc3f1ab16dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rpzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9t6rj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.849892 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dacb07a-db92-4294-a7e4-1012bbe6c9a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf7c955a7ce02cd5af2d3b7f447e561eea55e5104fa2eb07d863a90a82b0809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268ee1856e53eaef1996c8392608151cce484b2ea51ee7e2c4d8a7dd056fd165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.868745 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.885478 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d675b9e140bba41b172e1a2eb8f168705316b713dd7dbd84d83d975622ae056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.887122 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.887167 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.887186 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.887212 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.887234 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.901607 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ab112c4-45b9-468b-aa31-93b4f3c7444d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf95f7a39fc6d22065abdfdcef8d056954d0335d50463f78df6d2f34415999e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k77wb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x42s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.924307 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eea3820a-3f97-48a7-8b49-def506fe71e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:05:38Z\\\",\\\"message\\\":\\\"ty-vrzqb after 0 failed attempt(s)\\\\nI1125 15:05:38.979282 6906 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1125 15:05:38.978750 6906 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wpjkp in node crc\\\\nI1125 15:05:38.979333 6906 
services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 15:05:38.979361 6906 services_controller.go:452] Built service openshift-machine-api/machine-api-operator-machine-webhook per-node LB for network=default: []services.LB{}\\\\nI1125 15:05:38.979370 6906 serv\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99e897ebf5908cf0
e04725494ff9f42e52048f076547047d56f82cdb20a434\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mjnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-58mtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.939045 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j87z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ed72551-610b-4f03-8a57-319ef27e27e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2jw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j87z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.957419 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.972171 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c82408cb68bdf4a95fa06d8eba639e5d0ff9ea1bf717346919e9a28f8e2a8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.983392 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wpjkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b74ddb-bd2c-4b2d-a70e-9271305a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9781f6f92cee745a613857a0d675d37fa8252134d62c14b29a67591598ba34ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ll7w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wpjkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.990299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.990327 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.990337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.990350 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:56 crc kubenswrapper[4965]: I1125 15:05:56.990360 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:56Z","lastTransitionTime":"2025-11-25T15:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.000658 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32470785-6a9e-4ab4-bd44-a585e188fa99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fdf85dde1a81aaac8249cb0940e73c1e98c52dbbc36ed88fed6f32690e146e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07295807eb716effd046b347dc6223a4de86c1a7eda5100d1ae076291060008e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e9a48e8cfa146500e714748049b9af835cd97785d16a5a029d91157e17974492\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://129d530a1797cc6eeeffdd6e56f719a8501a3ab4d2c7ca2c3d2367553d13dcea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec34a8dcc43a6b5b710a60c04cc74a9caeb17da7986d034c62cd907a462532a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a48ba5fdecc8c03e522300e0b2e85ffe307de87e4ab0ea7e7ef5ead3189d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50714d526a86ea2f3434d9c96af75d3205cad04f6dafe3ae24f3e7a7e6d94495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m6qbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qtwc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:56Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.018731 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b2ecd2ca883500cb93fba43cb8463e5f26f31e3fd8cb153bb88bf8d0d6212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:
38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd6ea2c6e753faa6fa01bbd6a29f9529af7e2568054be7b792940249fbcbbf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.027954 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-59czm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eec84672-2d9f-4b5a-9d5d-514bf609b63d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c68effb0331f655fee028c89f014381760f9f13ff26928ee415252451deb13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fxgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-59czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.040014 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cff848-2870-492a-96aa-ad40e9c71185\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4785ec66405b307662115ef7f8024d0d71fbcf006cc4fb44696a35040a3fb14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c073
72b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e3d274c4aaa98a08dcea9b9a993597940b3922358fe5768546df6ed5c64b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e8531e21c7248875d9f26e6911e6056a88f4d71b4d8958a48034850b76b53a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.052327 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f029bcf6-e63a-4d3f-ab77-583355becd18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77b6d89da1fe359223702b1da62e142a2c26f6ea0d297f97285ca82a19a60521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674018ea4c629a1e6b50b39994d2896f5dcd4d84ea3f2c68d8da251c719919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612fd7d2368412a6de88a1a832f2de163d0433789adfb09f2d40fe4c7da4d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://60cbb74a48f59f67952f20442c8a6c2354f3a22a3b150d2b61a704e7378d97f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.071473 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4ec7185-93bd-449b-a072-62040cf01d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8c7bc3857cdf86db5d2122fd0298c10eaad0e2322779e8c1ae97baffd72a20\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f0616894ca996e4da2bef1799e845011f0334f394f5e803d84fdef821880ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf8e8164155386695caea7c1e439515d46935656dc91e7e2d6111be842547f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943bcbbbad8152d76f74416dd7aeb02d017ddd38556b65798846464d85c20154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f0a6659ea75c8dca7d5b555fb49dfc4f16979020f91ef87ce801f5874b839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9670f429061b2658ac6d2e3f8ca2307b56dd8050528b66820cc6492c310fe11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://269ad754197a5bdcd7af1753c1d3bc79af07fb9714384f952127ee04d55642cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://758025262a8451c82905a70ccf26cc6791261c3b969c0e3c0ae4607b3247ff49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:04:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:04:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.084049 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:04:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:05:57Z is after 2025-08-24T17:21:41Z" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.092180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.092226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.092270 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.092297 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.092318 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.195860 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.195923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.195932 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.195946 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.195954 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.299235 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.299295 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.299312 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.299336 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.299353 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.402358 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.402579 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.402743 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.402893 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.403089 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.506919 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.507021 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.507035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.507054 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.507069 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.612212 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.612310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.612336 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.612373 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.612396 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.715792 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.715858 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.715877 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.715904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.715921 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.771433 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.771686 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.771742 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.771798 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:57 crc kubenswrapper[4965]: E1125 15:05:57.772023 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:57 crc kubenswrapper[4965]: E1125 15:05:57.772249 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:57 crc kubenswrapper[4965]: E1125 15:05:57.772758 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:57 crc kubenswrapper[4965]: E1125 15:05:57.772853 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.818671 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.818740 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.818759 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.818784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.818804 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.922232 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.922290 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.922305 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.922328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:57 crc kubenswrapper[4965]: I1125 15:05:57.922343 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:57Z","lastTransitionTime":"2025-11-25T15:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.026668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.027139 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.027309 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.027505 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.027657 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.131534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.131605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.131622 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.131652 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.131672 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.235402 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.235454 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.235473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.235499 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.235517 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.338605 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.338680 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.338701 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.338735 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.338757 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.443644 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.444328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.444509 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.444675 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.444853 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.550191 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.551189 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.551228 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.551257 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.551274 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.654717 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.654782 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.654800 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.654828 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.654845 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.757673 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.757732 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.757749 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.757773 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.757790 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.860794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.860852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.860866 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.860888 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.860900 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.964876 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.964928 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.964946 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.965002 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:58 crc kubenswrapper[4965]: I1125 15:05:58.965020 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:58Z","lastTransitionTime":"2025-11-25T15:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.069108 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.069172 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.069191 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.069216 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.069234 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.172475 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.172543 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.172561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.172588 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.172606 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.276035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.276100 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.276116 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.276143 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.276163 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.378879 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.378932 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.378947 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.379001 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.379022 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.481949 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.482046 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.482064 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.482088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.482106 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.585239 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.585334 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.585361 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.585390 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.585414 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.688804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.688863 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.688875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.688895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.688909 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.771591 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.771666 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.771673 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.771673 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:05:59 crc kubenswrapper[4965]: E1125 15:05:59.771803 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:05:59 crc kubenswrapper[4965]: E1125 15:05:59.772103 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:05:59 crc kubenswrapper[4965]: E1125 15:05:59.772128 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:05:59 crc kubenswrapper[4965]: E1125 15:05:59.772211 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.793172 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.793223 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.793232 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.793251 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.793264 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.900948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.901025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.901044 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.901070 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:05:59 crc kubenswrapper[4965]: I1125 15:05:59.901089 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:05:59Z","lastTransitionTime":"2025-11-25T15:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.004322 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.004373 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.004391 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.004417 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.004435 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.107254 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.107321 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.107337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.107369 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.107387 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.210800 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.210871 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.210895 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.210923 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.210939 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.315654 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.315704 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.315719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.315739 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.315754 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.418182 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.418244 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.418266 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.418293 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.418313 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.520890 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.520940 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.521000 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.521031 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.521053 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.624271 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.624331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.624353 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.624380 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.624402 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.726944 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.727068 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.727185 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.727282 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.727356 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.830477 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.830526 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.830540 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.830566 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.830580 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.933707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.933757 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.933768 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.933788 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:00 crc kubenswrapper[4965]: I1125 15:06:00.933804 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:00Z","lastTransitionTime":"2025-11-25T15:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.037604 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.037671 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.037692 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.037716 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.037732 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.141445 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.141517 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.141536 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.141563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.141582 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.245225 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.245288 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.245305 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.245328 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.245345 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.347996 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.348057 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.348075 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.348101 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.348117 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.451248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.451304 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.451330 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.451360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.451383 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.554511 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.554574 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.554591 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.554617 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.554635 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.657078 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.657131 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.657147 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.657170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.657187 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.759703 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.759758 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.759774 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.759798 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.759817 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.771269 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.771294 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.771474 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.771588 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:06:01 crc kubenswrapper[4965]: E1125 15:06:01.771821 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:06:01 crc kubenswrapper[4965]: E1125 15:06:01.771942 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:06:01 crc kubenswrapper[4965]: E1125 15:06:01.772136 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:06:01 crc kubenswrapper[4965]: E1125 15:06:01.771623 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.862870 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.862919 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.862935 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.862956 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.863002 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.966496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.966562 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.966579 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.966603 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:01 crc kubenswrapper[4965]: I1125 15:06:01.966619 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:01Z","lastTransitionTime":"2025-11-25T15:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.069832 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.069893 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.069911 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.069937 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.069955 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.172800 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.172865 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.172888 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.173023 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.173064 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.275608 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.275668 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.275691 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.275715 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.275732 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.378695 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.378772 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.378798 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.378913 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.378940 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.481941 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.481988 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.481998 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.482015 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.482025 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.587182 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.587462 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.587544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.587653 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.587727 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.691733 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.691816 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.691837 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.691866 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.691887 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.772555 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1" Nov 25 15:06:02 crc kubenswrapper[4965]: E1125 15:06:02.772754 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.795490 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.795534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.795545 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.795563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.795575 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.899222 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.899619 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.899803 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.899832 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:02 crc kubenswrapper[4965]: I1125 15:06:02.899849 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:02Z","lastTransitionTime":"2025-11-25T15:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.002480 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.002542 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.002563 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.002587 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.002604 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.105671 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.105719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.105736 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.105760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.105779 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.209176 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.209226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.209245 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.209268 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.209284 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.312576 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.312674 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.312805 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.312874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.312895 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.415467 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.415505 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.415514 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.415528 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.415537 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.518114 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.518157 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.518196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.518215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.518249 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.621819 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.621887 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.621906 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.621934 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.621953 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.724788 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.724826 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.724836 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.724852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.724862 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.770742 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.770852 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.770902 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.770911 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:03 crc kubenswrapper[4965]: E1125 15:06:03.771144 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:06:03 crc kubenswrapper[4965]: E1125 15:06:03.771320 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:06:03 crc kubenswrapper[4965]: E1125 15:06:03.771531 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:06:03 crc kubenswrapper[4965]: E1125 15:06:03.771836 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.828162 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.828214 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.828231 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.828298 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.828317 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.931877 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.931952 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.932011 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.932042 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:03 crc kubenswrapper[4965]: I1125 15:06:03.932064 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:03Z","lastTransitionTime":"2025-11-25T15:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.035096 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.035170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.035187 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.035213 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.035230 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.138395 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.138451 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.138461 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.138476 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.138486 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.241392 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.241460 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.241479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.241502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.241540 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.344794 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.344869 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.344889 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.344913 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.344931 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.448193 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.448261 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.448284 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.448314 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.448337 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.551772 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.551835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.551854 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.551878 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.551895 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.654180 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.654246 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.654265 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.654287 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.654306 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.756870 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.757130 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.757149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.757170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.757182 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.859811 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.859842 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.859851 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.859867 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.859879 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.962441 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.962506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.962532 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.962561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:04 crc kubenswrapper[4965]: I1125 15:06:04.962588 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:04Z","lastTransitionTime":"2025-11-25T15:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.065453 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.065504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.065516 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.065533 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.065548 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.168258 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.168293 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.168301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.168316 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.168329 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.271248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.271307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.271325 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.271350 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.271369 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.374465 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.374507 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.374520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.374538 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.374550 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.477730 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.477769 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.477780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.477824 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.477838 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.580770 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.580812 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.580823 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.580838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.580849 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.684433 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.684472 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.684480 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.684510 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.684522 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.771203 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.771235 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.771200 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.771215 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:05 crc kubenswrapper[4965]: E1125 15:06:05.771405 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:06:05 crc kubenswrapper[4965]: E1125 15:06:05.771587 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:06:05 crc kubenswrapper[4965]: E1125 15:06:05.771744 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:06:05 crc kubenswrapper[4965]: E1125 15:06:05.771813 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.786790 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.786854 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.786875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.786904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.786930 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.889544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.889589 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.889606 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.889629 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.889646 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.902479 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.902513 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.902524 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.902540 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.902557 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:06:05Z","lastTransitionTime":"2025-11-25T15:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.965310 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m"] Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.965748 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.970196 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.970514 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.970570 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.970763 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 15:06:05 crc kubenswrapper[4965]: I1125 15:06:05.989661 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=38.98963401 podStartE2EDuration="38.98963401s" podCreationTimestamp="2025-11-25 15:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:05.989444874 +0000 UTC m=+110.957038640" watchObservedRunningTime="2025-11-25 15:06:05.98963401 +0000 UTC m=+110.957227796" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.068786 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podStartSLOduration=89.068767316 podStartE2EDuration="1m29.068767316s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.049443532 +0000 UTC m=+111.017037278" watchObservedRunningTime="2025-11-25 
15:06:06.068767316 +0000 UTC m=+111.036361062" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.078964 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/03da3f4d-4397-4341-adcf-aac8c4c9e237-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.079096 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da3f4d-4397-4341-adcf-aac8c4c9e237-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.079168 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03da3f4d-4397-4341-adcf-aac8c4c9e237-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.079244 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/03da3f4d-4397-4341-adcf-aac8c4c9e237-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.079345 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03da3f4d-4397-4341-adcf-aac8c4c9e237-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.095893 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8jdpp" podStartSLOduration=89.09586898 podStartE2EDuration="1m29.09586898s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.069275879 +0000 UTC m=+111.036869635" watchObservedRunningTime="2025-11-25 15:06:06.09586898 +0000 UTC m=+111.063462736" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.114673 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9t6rj" podStartSLOduration=89.114641929 podStartE2EDuration="1m29.114641929s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.097728501 +0000 UTC m=+111.065322247" watchObservedRunningTime="2025-11-25 15:06:06.114641929 +0000 UTC m=+111.082235725" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.148521 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wpjkp" podStartSLOduration=89.148499457 podStartE2EDuration="1m29.148499457s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.146861233 +0000 UTC m=+111.114454999" watchObservedRunningTime="2025-11-25 15:06:06.148499457 +0000 
UTC m=+111.116093213" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.163463 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qtwc9" podStartSLOduration=89.163444003 podStartE2EDuration="1m29.163444003s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.163358691 +0000 UTC m=+111.130952447" watchObservedRunningTime="2025-11-25 15:06:06.163444003 +0000 UTC m=+111.131037749" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180584 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/03da3f4d-4397-4341-adcf-aac8c4c9e237-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180637 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03da3f4d-4397-4341-adcf-aac8c4c9e237-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180669 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/03da3f4d-4397-4341-adcf-aac8c4c9e237-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180684 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da3f4d-4397-4341-adcf-aac8c4c9e237-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180708 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03da3f4d-4397-4341-adcf-aac8c4c9e237-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180770 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/03da3f4d-4397-4341-adcf-aac8c4c9e237-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.180803 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/03da3f4d-4397-4341-adcf-aac8c4c9e237-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.181858 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03da3f4d-4397-4341-adcf-aac8c4c9e237-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.200937 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03da3f4d-4397-4341-adcf-aac8c4c9e237-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.202942 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da3f4d-4397-4341-adcf-aac8c4c9e237-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h989m\" (UID: \"03da3f4d-4397-4341-adcf-aac8c4c9e237\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.226218 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=88.226195784 podStartE2EDuration="1m28.226195784s" podCreationTimestamp="2025-11-25 15:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.221757974 +0000 UTC m=+111.189351730" watchObservedRunningTime="2025-11-25 15:06:06.226195784 +0000 UTC m=+111.193789530" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.237561 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=59.237540132 podStartE2EDuration="59.237540132s" podCreationTimestamp="2025-11-25 15:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.237110281 +0000 UTC 
m=+111.204704027" watchObservedRunningTime="2025-11-25 15:06:06.237540132 +0000 UTC m=+111.205133878" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.262140 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=86.262121719 podStartE2EDuration="1m26.262121719s" podCreationTimestamp="2025-11-25 15:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.262060117 +0000 UTC m=+111.229653873" watchObservedRunningTime="2025-11-25 15:06:06.262121719 +0000 UTC m=+111.229715465" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.287440 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.308059 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-59czm" podStartSLOduration=89.308043624 podStartE2EDuration="1m29.308043624s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.307106289 +0000 UTC m=+111.274700035" watchObservedRunningTime="2025-11-25 15:06:06.308043624 +0000 UTC m=+111.275637370" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.322409 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.322390953 podStartE2EDuration="1m29.322390953s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:06.32193563 +0000 UTC m=+111.289529376" watchObservedRunningTime="2025-11-25 15:06:06.322390953 +0000 
UTC m=+111.289984699" Nov 25 15:06:06 crc kubenswrapper[4965]: I1125 15:06:06.360543 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" event={"ID":"03da3f4d-4397-4341-adcf-aac8c4c9e237","Type":"ContainerStarted","Data":"22f385e390dd7cf93bc512fe5691ad169e5a893295439bab098aa878610eb387"} Nov 25 15:06:07 crc kubenswrapper[4965]: I1125 15:06:07.366015 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" event={"ID":"03da3f4d-4397-4341-adcf-aac8c4c9e237","Type":"ContainerStarted","Data":"044936e819deb8208437c084391bbbcbc3963713b29d82576105db8ac2d41be5"} Nov 25 15:06:07 crc kubenswrapper[4965]: I1125 15:06:07.771251 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:06:07 crc kubenswrapper[4965]: I1125 15:06:07.771319 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:07 crc kubenswrapper[4965]: I1125 15:06:07.771369 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:07 crc kubenswrapper[4965]: E1125 15:06:07.771443 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:06:07 crc kubenswrapper[4965]: E1125 15:06:07.771611 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:06:07 crc kubenswrapper[4965]: E1125 15:06:07.771711 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:06:07 crc kubenswrapper[4965]: I1125 15:06:07.772461 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:07 crc kubenswrapper[4965]: E1125 15:06:07.772751 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:06:09 crc kubenswrapper[4965]: I1125 15:06:09.771297 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:09 crc kubenswrapper[4965]: E1125 15:06:09.771726 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:06:09 crc kubenswrapper[4965]: I1125 15:06:09.771435 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:06:09 crc kubenswrapper[4965]: E1125 15:06:09.771811 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0" Nov 25 15:06:09 crc kubenswrapper[4965]: I1125 15:06:09.771465 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:09 crc kubenswrapper[4965]: E1125 15:06:09.771860 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:09 crc kubenswrapper[4965]: I1125 15:06:09.771386 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:09 crc kubenswrapper[4965]: E1125 15:06:09.771910 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:11 crc kubenswrapper[4965]: I1125 15:06:11.770657 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:11 crc kubenswrapper[4965]: I1125 15:06:11.770711 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:11 crc kubenswrapper[4965]: I1125 15:06:11.770795 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:11 crc kubenswrapper[4965]: E1125 15:06:11.770798 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:11 crc kubenswrapper[4965]: I1125 15:06:11.770664 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:11 crc kubenswrapper[4965]: E1125 15:06:11.771046 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:11 crc kubenswrapper[4965]: E1125 15:06:11.771168 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:11 crc kubenswrapper[4965]: E1125 15:06:11.771292 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.385551 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/1.log"
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.386104 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/0.log"
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.386166 4965 generic.go:334] "Generic (PLEG): container finished" podID="7de2930c-eabd-4919-b214-30b0c83141f7" containerID="837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca" exitCode=1
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.386203 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerDied","Data":"837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca"}
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.386248 4965 scope.go:117] "RemoveContainer" containerID="af425d750ca8bb5a2e50c24b64d91e497aebb3223225a3e3db39575c4dc79a99"
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.387049 4965 scope.go:117] "RemoveContainer" containerID="837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca"
Nov 25 15:06:12 crc kubenswrapper[4965]: E1125 15:06:12.387429 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8jdpp_openshift-multus(7de2930c-eabd-4919-b214-30b0c83141f7)\"" pod="openshift-multus/multus-8jdpp" podUID="7de2930c-eabd-4919-b214-30b0c83141f7"
Nov 25 15:06:12 crc kubenswrapper[4965]: I1125 15:06:12.407674 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h989m" podStartSLOduration=95.407655458 podStartE2EDuration="1m35.407655458s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:07.395848591 +0000 UTC m=+112.363442367" watchObservedRunningTime="2025-11-25 15:06:12.407655458 +0000 UTC m=+117.375249214"
Nov 25 15:06:13 crc kubenswrapper[4965]: I1125 15:06:13.391532 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/1.log"
Nov 25 15:06:13 crc kubenswrapper[4965]: I1125 15:06:13.770762 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:13 crc kubenswrapper[4965]: I1125 15:06:13.770772 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:13 crc kubenswrapper[4965]: I1125 15:06:13.770772 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:13 crc kubenswrapper[4965]: I1125 15:06:13.770803 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:13 crc kubenswrapper[4965]: E1125 15:06:13.771408 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:13 crc kubenswrapper[4965]: E1125 15:06:13.771281 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:13 crc kubenswrapper[4965]: E1125 15:06:13.771440 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:13 crc kubenswrapper[4965]: E1125 15:06:13.771267 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:15 crc kubenswrapper[4965]: I1125 15:06:15.771355 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:15 crc kubenswrapper[4965]: I1125 15:06:15.771493 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:15 crc kubenswrapper[4965]: E1125 15:06:15.771576 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:15 crc kubenswrapper[4965]: I1125 15:06:15.771355 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:15 crc kubenswrapper[4965]: E1125 15:06:15.771674 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:15 crc kubenswrapper[4965]: I1125 15:06:15.771373 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:15 crc kubenswrapper[4965]: E1125 15:06:15.771753 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:15 crc kubenswrapper[4965]: E1125 15:06:15.771905 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:16 crc kubenswrapper[4965]: E1125 15:06:16.723392 4965 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Nov 25 15:06:16 crc kubenswrapper[4965]: I1125 15:06:16.772751 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1"
Nov 25 15:06:16 crc kubenswrapper[4965]: E1125 15:06:16.772997 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-58mtl_openshift-ovn-kubernetes(eea3820a-3f97-48a7-8b49-def506fe71e2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2"
Nov 25 15:06:17 crc kubenswrapper[4965]: E1125 15:06:17.098093 4965 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 25 15:06:17 crc kubenswrapper[4965]: I1125 15:06:17.770886 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:17 crc kubenswrapper[4965]: I1125 15:06:17.770869 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:17 crc kubenswrapper[4965]: I1125 15:06:17.770958 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:17 crc kubenswrapper[4965]: I1125 15:06:17.771098 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:17 crc kubenswrapper[4965]: E1125 15:06:17.771286 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:17 crc kubenswrapper[4965]: E1125 15:06:17.771605 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:17 crc kubenswrapper[4965]: E1125 15:06:17.771788 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:17 crc kubenswrapper[4965]: E1125 15:06:17.771838 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:19 crc kubenswrapper[4965]: I1125 15:06:19.771543 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:19 crc kubenswrapper[4965]: I1125 15:06:19.771620 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:19 crc kubenswrapper[4965]: E1125 15:06:19.771751 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:19 crc kubenswrapper[4965]: I1125 15:06:19.771561 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:19 crc kubenswrapper[4965]: E1125 15:06:19.772001 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:19 crc kubenswrapper[4965]: I1125 15:06:19.772093 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:19 crc kubenswrapper[4965]: E1125 15:06:19.772185 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:19 crc kubenswrapper[4965]: E1125 15:06:19.772264 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:21 crc kubenswrapper[4965]: I1125 15:06:21.771123 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:21 crc kubenswrapper[4965]: E1125 15:06:21.771282 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:21 crc kubenswrapper[4965]: I1125 15:06:21.771444 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:21 crc kubenswrapper[4965]: I1125 15:06:21.771456 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:21 crc kubenswrapper[4965]: E1125 15:06:21.771503 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:21 crc kubenswrapper[4965]: E1125 15:06:21.771612 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:21 crc kubenswrapper[4965]: I1125 15:06:21.772127 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:21 crc kubenswrapper[4965]: E1125 15:06:21.772201 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:22 crc kubenswrapper[4965]: E1125 15:06:22.099440 4965 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 25 15:06:23 crc kubenswrapper[4965]: I1125 15:06:23.771372 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:23 crc kubenswrapper[4965]: E1125 15:06:23.771599 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:23 crc kubenswrapper[4965]: I1125 15:06:23.771372 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:23 crc kubenswrapper[4965]: E1125 15:06:23.771724 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:23 crc kubenswrapper[4965]: I1125 15:06:23.771421 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:23 crc kubenswrapper[4965]: E1125 15:06:23.771808 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:23 crc kubenswrapper[4965]: I1125 15:06:23.772177 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:23 crc kubenswrapper[4965]: E1125 15:06:23.772338 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:24 crc kubenswrapper[4965]: I1125 15:06:24.776068 4965 scope.go:117] "RemoveContainer" containerID="837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca"
Nov 25 15:06:25 crc kubenswrapper[4965]: I1125 15:06:25.433251 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/1.log"
Nov 25 15:06:25 crc kubenswrapper[4965]: I1125 15:06:25.433304 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerStarted","Data":"c96544dec6c115d2b40555ed7271e0566eeeb05c3c57d0c0534d8bcd6583458f"}
Nov 25 15:06:25 crc kubenswrapper[4965]: I1125 15:06:25.771252 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:25 crc kubenswrapper[4965]: I1125 15:06:25.771357 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:25 crc kubenswrapper[4965]: E1125 15:06:25.771432 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:25 crc kubenswrapper[4965]: E1125 15:06:25.771501 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:25 crc kubenswrapper[4965]: I1125 15:06:25.771555 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:25 crc kubenswrapper[4965]: E1125 15:06:25.771603 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:25 crc kubenswrapper[4965]: I1125 15:06:25.771639 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:25 crc kubenswrapper[4965]: E1125 15:06:25.771693 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:27 crc kubenswrapper[4965]: E1125 15:06:27.099888 4965 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 25 15:06:27 crc kubenswrapper[4965]: I1125 15:06:27.770630 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:27 crc kubenswrapper[4965]: I1125 15:06:27.770699 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:27 crc kubenswrapper[4965]: E1125 15:06:27.770822 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:27 crc kubenswrapper[4965]: I1125 15:06:27.770905 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:27 crc kubenswrapper[4965]: I1125 15:06:27.771152 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:27 crc kubenswrapper[4965]: E1125 15:06:27.771094 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:27 crc kubenswrapper[4965]: E1125 15:06:27.771895 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:27 crc kubenswrapper[4965]: E1125 15:06:27.771742 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:27 crc kubenswrapper[4965]: I1125 15:06:27.772302 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1"
Nov 25 15:06:28 crc kubenswrapper[4965]: I1125 15:06:28.444063 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/3.log"
Nov 25 15:06:28 crc kubenswrapper[4965]: I1125 15:06:28.446735 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerStarted","Data":"edf8a7dc4425022f798e716d332bb0b3e616154dad6e2f975ceed2665e9bcaa0"}
Nov 25 15:06:28 crc kubenswrapper[4965]: I1125 15:06:28.447273 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl"
Nov 25 15:06:28 crc kubenswrapper[4965]: I1125 15:06:28.585064 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podStartSLOduration=111.585045594 podStartE2EDuration="1m51.585045594s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:28.479263287 +0000 UTC m=+133.446857033" watchObservedRunningTime="2025-11-25 15:06:28.585045594 +0000 UTC m=+133.552639340"
Nov 25 15:06:28 crc kubenswrapper[4965]: I1125 15:06:28.585253 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j87z5"]
Nov 25 15:06:28 crc kubenswrapper[4965]: I1125 15:06:28.585327 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:28 crc kubenswrapper[4965]: E1125 15:06:28.585405 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:29 crc kubenswrapper[4965]: I1125 15:06:29.771078 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:29 crc kubenswrapper[4965]: I1125 15:06:29.771115 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:29 crc kubenswrapper[4965]: E1125 15:06:29.771531 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:29 crc kubenswrapper[4965]: E1125 15:06:29.771582 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:29 crc kubenswrapper[4965]: I1125 15:06:29.771893 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:29 crc kubenswrapper[4965]: E1125 15:06:29.772062 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:30 crc kubenswrapper[4965]: I1125 15:06:30.946306 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:30 crc kubenswrapper[4965]: E1125 15:06:30.946883 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:06:30 crc kubenswrapper[4965]: I1125 15:06:30.946747 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:30 crc kubenswrapper[4965]: E1125 15:06:30.947206 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j87z5" podUID="6ed72551-610b-4f03-8a57-319ef27e27e0"
Nov 25 15:06:31 crc kubenswrapper[4965]: I1125 15:06:31.771126 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:31 crc kubenswrapper[4965]: E1125 15:06:31.771252 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:06:31 crc kubenswrapper[4965]: I1125 15:06:31.771127 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:06:31 crc kubenswrapper[4965]: E1125 15:06:31.771405 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:06:32 crc kubenswrapper[4965]: I1125 15:06:32.771093 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:06:32 crc kubenswrapper[4965]: I1125 15:06:32.771094 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5"
Nov 25 15:06:32 crc kubenswrapper[4965]: I1125 15:06:32.773538 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Nov 25 15:06:32 crc kubenswrapper[4965]: I1125 15:06:32.773845 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Nov 25 15:06:32 crc kubenswrapper[4965]: I1125 15:06:32.774165 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 25 15:06:32 crc kubenswrapper[4965]: I1125 15:06:32.775592 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Nov 25 15:06:33 crc kubenswrapper[4965]: I1125 15:06:33.771210 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:06:33 crc kubenswrapper[4965]: I1125 15:06:33.771212 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:33 crc kubenswrapper[4965]: I1125 15:06:33.774392 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 15:06:33 crc kubenswrapper[4965]: I1125 15:06:33.775483 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.664363 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.702445 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rgvb"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.702952 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.703509 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b6gl2"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.703989 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.705032 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.708480 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.708539 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.708898 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.709592 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.709321 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.709372 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.709438 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.712706 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.713057 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.713064 4965 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.713348 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.717994 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.718358 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.720112 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qvn4k"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.720522 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.723397 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.724084 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.724998 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.725248 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.725387 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.725389 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.725620 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.726003 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.726300 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.728865 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.728916 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.728867 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.731207 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.731736 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733434 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733510 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733555 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733594 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733780 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 
15:06:36.733797 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733856 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.733875 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.734011 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.734075 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.734475 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.734589 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.734739 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.735050 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.735225 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.735760 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.735772 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.736276 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.738543 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pdprk"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.738819 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.749144 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.749283 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.749323 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.749546 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.750289 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.753275 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.772309 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.772525 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.772569 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.772882 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.772989 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773069 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773133 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773213 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773260 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773289 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773345 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773422 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773516 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773588 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773658 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.773863 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.774017 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.774090 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: 
I1125 15:06:36.774323 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.775005 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.775300 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.777135 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.778517 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.779274 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l75ns"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.779609 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.779707 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.779857 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.779930 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fjgbs"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.780168 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5mpvj"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.780265 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.780391 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-67kvn"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.780493 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.780632 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.780719 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.781081 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jv944"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.781357 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.781421 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.781445 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.781722 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.782109 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.782276 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.782367 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7hnpl"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.782574 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.782632 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.786979 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787128 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787218 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787128 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787348 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787380 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787611 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2pf5t"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.787883 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2pf5t" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.788084 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.789004 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-82czk"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.789610 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.790180 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.791119 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792097 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792191 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792392 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792612 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792741 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792848 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792874 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.792885 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.793012 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.793015 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.793039 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.793364 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nld56"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.793641 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.806446 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.808104 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.808627 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.813988 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.838311 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.838818 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.838910 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.839385 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.839742 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.839872 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.839960 4965 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.840058 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.840249 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.840314 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.840384 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.840623 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.840836 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.841071 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.842747 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22g9m"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.843277 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lcspp"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.843554 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.843683 
4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.843919 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.844238 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.844535 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.845155 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.851331 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.851919 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.849686 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.853519 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.853820 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.854462 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.856458 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857006 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-service-ca-bundle\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857047 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-signing-key\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857075 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgbdl\" (UniqueName: \"kubernetes.io/projected/77477a76-df54-4755-89b0-9b2ec40e098d-kube-api-access-lgbdl\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857103 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-auth-proxy-config\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857125 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-encryption-config\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857145 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d55c18b9-bb80-428c-95ea-f21c6b0694e4-audit-dir\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857165 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-config\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " 
pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857187 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/317839b7-786d-4e93-8b37-4dd23e4a5032-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857210 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9953c224-6f16-4a0d-ad9a-3ea1e4914499-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857231 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-config\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857250 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-dir\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.857271 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/317839b7-786d-4e93-8b37-4dd23e4a5032-images\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858013 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858055 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263-metrics-tls\") pod \"dns-operator-744455d44c-l75ns\" (UID: \"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263\") " pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858076 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c34009-3606-4b93-9f5f-c8a478aee354-service-ca-bundle\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858101 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eeea6-61f3-4cc7-b51c-f3100d84f707-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc 
kubenswrapper[4965]: I1125 15:06:36.858124 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858145 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9953c224-6f16-4a0d-ad9a-3ea1e4914499-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858167 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-etcd-serving-ca\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858190 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-audit\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858211 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858244 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77477a76-df54-4755-89b0-9b2ec40e098d-serving-cert\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858268 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgm5\" (UniqueName: \"kubernetes.io/projected/423ab6bf-9cad-43cd-af44-e0cee05b262b-kube-api-access-rkgm5\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858291 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-etcd-client\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858313 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858339 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858363 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kxt7\" (UniqueName: \"kubernetes.io/projected/8a01a344-a2a2-4d3c-9bc3-5e911936606c-kube-api-access-6kxt7\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858387 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/30b30333-fbb1-476f-8bf2-146f2ee696a7-tmpfs\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858409 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wl8\" (UniqueName: \"kubernetes.io/projected/30b30333-fbb1-476f-8bf2-146f2ee696a7-kube-api-access-x4wl8\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858432 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-config\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858456 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5sj\" (UniqueName: \"kubernetes.io/projected/78e98b3d-733f-4b7a-abcf-950d6870c04f-kube-api-access-km5sj\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858488 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-oauth-config\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858510 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-client-ca\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858536 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858561 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8944b4a9-8e7d-4a3a-a526-11a31e795453-srv-cert\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858583 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858604 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-default-certificate\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858626 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-metrics-certs\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858647 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91f26391-3d69-4625-a15b-286b29e14161-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858679 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d55c18b9-bb80-428c-95ea-f21c6b0694e4-node-pullsecrets\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858700 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-policies\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858724 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-ca\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858746 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f26391-3d69-4625-a15b-286b29e14161-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: 
\"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858770 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858783 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858793 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858819 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc4vw\" (UniqueName: \"kubernetes.io/projected/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-kube-api-access-rc4vw\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858840 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc56f\" (UniqueName: 
\"kubernetes.io/projected/107eeea6-61f3-4cc7-b51c-f3100d84f707-kube-api-access-tc56f\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858866 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr852\" (UniqueName: \"kubernetes.io/projected/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-kube-api-access-dr852\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858889 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f26391-3d69-4625-a15b-286b29e14161-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858902 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858913 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-client\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858937 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09e6c287-3ef1-4ea0-87c9-54b59acfc772-serving-cert\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.858960 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4fxm\" (UniqueName: \"kubernetes.io/projected/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-kube-api-access-d4fxm\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859006 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859032 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/371d4f01-4337-4da2-8e72-7a79d2a7f98c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksgsk\" (UID: \"371d4f01-4337-4da2-8e72-7a79d2a7f98c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859195 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-serving-cert\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") 
" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859222 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwb8x\" (UniqueName: \"kubernetes.io/projected/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-kube-api-access-hwb8x\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859254 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30b30333-fbb1-476f-8bf2-146f2ee696a7-webhook-cert\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859276 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e98b3d-733f-4b7a-abcf-950d6870c04f-serving-cert\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859298 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423ab6bf-9cad-43cd-af44-e0cee05b262b-config\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859318 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/17977cba-cde5-4d42-9c64-1c37568db595-proxy-tls\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859336 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-srv-cert\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859356 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-config\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859402 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-serving-cert\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859424 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 
15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859448 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scc9c\" (UniqueName: \"kubernetes.io/projected/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-kube-api-access-scc9c\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859467 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-stats-auth\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859486 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859509 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs5mv\" (UniqueName: \"kubernetes.io/projected/09e6c287-3ef1-4ea0-87c9-54b59acfc772-kube-api-access-fs5mv\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859588 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-service-ca\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859609 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423ab6bf-9cad-43cd-af44-e0cee05b262b-serving-cert\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859647 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97be63b-6ce3-44ae-b820-25260bf392bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2t52f\" (UID: \"d97be63b-6ce3-44ae-b820-25260bf392bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859673 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-profile-collector-cert\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859708 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859734 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859780 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859804 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-machine-approver-tls\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859826 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-service-ca\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859851 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/107eeea6-61f3-4cc7-b51c-f3100d84f707-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859872 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.859989 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqrjc\" (UniqueName: \"kubernetes.io/projected/17977cba-cde5-4d42-9c64-1c37568db595-kube-api-access-tqrjc\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860020 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-secret-volume\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860081 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-config\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860107 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8944b4a9-8e7d-4a3a-a526-11a31e795453-profile-collector-cert\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860145 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp8f4\" (UniqueName: \"kubernetes.io/projected/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-kube-api-access-wp8f4\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860180 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-image-import-ca\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860217 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317839b7-786d-4e93-8b37-4dd23e4a5032-config\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860253 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-config\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: 
\"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860285 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-client-ca\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860312 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfllt\" (UniqueName: \"kubernetes.io/projected/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-kube-api-access-rfllt\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860346 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860349 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-serving-cert\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860374 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17977cba-cde5-4d42-9c64-1c37568db595-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860412 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-signing-cabundle\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860435 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznsz\" (UniqueName: \"kubernetes.io/projected/09c34009-3606-4b93-9f5f-c8a478aee354-kube-api-access-vznsz\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860472 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rvq\" (UniqueName: \"kubernetes.io/projected/d97be63b-6ce3-44ae-b820-25260bf392bf-kube-api-access-77rvq\") pod \"package-server-manager-789f6589d5-2t52f\" (UID: \"d97be63b-6ce3-44ae-b820-25260bf392bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860498 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-proxy-tls\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860524 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-x8wgm\" (UniqueName: \"kubernetes.io/projected/317839b7-786d-4e93-8b37-4dd23e4a5032-kube-api-access-x8wgm\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860547 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6vz\" (UniqueName: \"kubernetes.io/projected/8944b4a9-8e7d-4a3a-a526-11a31e795453-kube-api-access-lq6vz\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860569 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860621 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30b30333-fbb1-476f-8bf2-146f2ee696a7-apiservice-cert\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860653 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzj4\" (UniqueName: \"kubernetes.io/projected/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-kube-api-access-grzj4\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: 
\"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860679 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p52vj\" (UniqueName: \"kubernetes.io/projected/9953c224-6f16-4a0d-ad9a-3ea1e4914499-kube-api-access-p52vj\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860703 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423ab6bf-9cad-43cd-af44-e0cee05b262b-trusted-ca\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860723 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860781 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccg6s\" (UniqueName: \"kubernetes.io/projected/b6686556-54b1-4232-a5ec-7bacd966ff86-kube-api-access-ccg6s\") pod \"downloads-7954f5f757-2pf5t\" (UID: \"b6686556-54b1-4232-a5ec-7bacd966ff86\") " pod="openshift-console/downloads-7954f5f757-2pf5t" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860818 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860842 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-trusted-ca-bundle\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860877 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-config\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860899 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860925 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-oauth-serving-cert\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860943 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.860991 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17977cba-cde5-4d42-9c64-1c37568db595-images\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861020 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf75w\" (UniqueName: \"kubernetes.io/projected/371d4f01-4337-4da2-8e72-7a79d2a7f98c-kube-api-access-cf75w\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksgsk\" (UID: \"371d4f01-4337-4da2-8e72-7a79d2a7f98c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861045 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861068 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l87vz\" (UniqueName: \"kubernetes.io/projected/d55c18b9-bb80-428c-95ea-f21c6b0694e4-kube-api-access-l87vz\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861112 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlbd\" (UniqueName: \"kubernetes.io/projected/7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263-kube-api-access-xzlbd\") pod \"dns-operator-744455d44c-l75ns\" (UID: \"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263\") " pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861135 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzwr\" (UniqueName: \"kubernetes.io/projected/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-kube-api-access-pzzwr\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861156 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861179 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861201 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-config-volume\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861222 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7pjw\" (UniqueName: \"kubernetes.io/projected/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-kube-api-access-t7pjw\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.861246 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.869872 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.872161 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.873097 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.873468 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.876785 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.877497 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.886681 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.889396 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.890003 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.890415 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.890643 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.891377 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.895011 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.898333 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gjnb4"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.899463 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.901333 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.912144 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b6gl2"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.912189 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.914619 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2pf5t"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.915854 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.927012 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.927327 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jv944"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.928823 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.930400 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l75ns"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.934208 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22g9m"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.934294 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.934310 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.937241 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7hnpl"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.938665 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.941084 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fjgbs"] Nov 25 15:06:36 crc 
kubenswrapper[4965]: I1125 15:06:36.947313 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.950492 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qvn4k"]
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.951023 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"]
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.952862 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5mpvj"]
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.954444 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk"]
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.955291 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f"]
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.956927 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-67kvn"]
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962497 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-config\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962526 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-client-ca\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962544 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfllt\" (UniqueName: \"kubernetes.io/projected/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-kube-api-access-rfllt\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962565 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-dir\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962582 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17977cba-cde5-4d42-9c64-1c37568db595-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962598 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-serving-cert\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962617 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f89119-a5f1-41eb-822c-fdd243818a4b-config\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962634 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznsz\" (UniqueName: \"kubernetes.io/projected/09c34009-3606-4b93-9f5f-c8a478aee354-kube-api-access-vznsz\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962651 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77rvq\" (UniqueName: \"kubernetes.io/projected/d97be63b-6ce3-44ae-b820-25260bf392bf-kube-api-access-77rvq\") pod \"package-server-manager-789f6589d5-2t52f\" (UID: \"d97be63b-6ce3-44ae-b820-25260bf392bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962671 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-signing-cabundle\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962686 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-encryption-config\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962703 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-proxy-tls\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962718 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgr2\" (UniqueName: \"kubernetes.io/projected/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-kube-api-access-xlgr2\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962733 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8wgm\" (UniqueName: \"kubernetes.io/projected/317839b7-786d-4e93-8b37-4dd23e4a5032-kube-api-access-x8wgm\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962749 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6vz\" (UniqueName: \"kubernetes.io/projected/8944b4a9-8e7d-4a3a-a526-11a31e795453-kube-api-access-lq6vz\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962763 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962779 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f7d1e-4721-42db-90a8-d9ce7bfb2302-config\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962797 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30b30333-fbb1-476f-8bf2-146f2ee696a7-apiservice-cert\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962814 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzj4\" (UniqueName: \"kubernetes.io/projected/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-kube-api-access-grzj4\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962832 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p52vj\" (UniqueName: \"kubernetes.io/projected/9953c224-6f16-4a0d-ad9a-3ea1e4914499-kube-api-access-p52vj\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962848 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423ab6bf-9cad-43cd-af44-e0cee05b262b-trusted-ca\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962862 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccg6s\" (UniqueName: \"kubernetes.io/projected/b6686556-54b1-4232-a5ec-7bacd966ff86-kube-api-access-ccg6s\") pod \"downloads-7954f5f757-2pf5t\" (UID: \"b6686556-54b1-4232-a5ec-7bacd966ff86\") " pod="openshift-console/downloads-7954f5f757-2pf5t"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962877 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-trusted-ca-bundle\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962894 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962924 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17977cba-cde5-4d42-9c64-1c37568db595-images\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962940 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-config\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962959 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.962996 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-oauth-serving-cert\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963021 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf75w\" (UniqueName: \"kubernetes.io/projected/371d4f01-4337-4da2-8e72-7a79d2a7f98c-kube-api-access-cf75w\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksgsk\" (UID: \"371d4f01-4337-4da2-8e72-7a79d2a7f98c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963042 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l87vz\" (UniqueName: \"kubernetes.io/projected/d55c18b9-bb80-428c-95ea-f21c6b0694e4-kube-api-access-l87vz\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963061 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963082 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlbd\" (UniqueName: \"kubernetes.io/projected/7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263-kube-api-access-xzlbd\") pod \"dns-operator-744455d44c-l75ns\" (UID: \"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263\") " pod="openshift-dns-operator/dns-operator-744455d44c-l75ns"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963102 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzwr\" (UniqueName: \"kubernetes.io/projected/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-kube-api-access-pzzwr\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963128 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963150 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963172 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963191 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ded91b12-10e5-4c46-aaa7-e54c34072789-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lcspp\" (UID: \"ded91b12-10e5-4c46-aaa7-e54c34072789\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963214 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-config-volume\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963232 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7pjw\" (UniqueName: \"kubernetes.io/projected/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-kube-api-access-t7pjw\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963248 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnlzk\" (UniqueName: \"kubernetes.io/projected/ded91b12-10e5-4c46-aaa7-e54c34072789-kube-api-access-hnlzk\") pod \"multus-admission-controller-857f4d67dd-lcspp\" (UID: \"ded91b12-10e5-4c46-aaa7-e54c34072789\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963266 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963283 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-service-ca-bundle\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963298 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-signing-key\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963314 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgbdl\" (UniqueName: \"kubernetes.io/projected/77477a76-df54-4755-89b0-9b2ec40e098d-kube-api-access-lgbdl\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963328 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-encryption-config\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963345 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d55c18b9-bb80-428c-95ea-f21c6b0694e4-audit-dir\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963369 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-auth-proxy-config\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963392 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-config\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963417 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whbrg\" (UniqueName: \"kubernetes.io/projected/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-kube-api-access-whbrg\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963444 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/317839b7-786d-4e93-8b37-4dd23e4a5032-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963472 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9953c224-6f16-4a0d-ad9a-3ea1e4914499-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963496 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-config\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963518 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-dir\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963540 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzv29\" (UniqueName: \"kubernetes.io/projected/439f7d1e-4721-42db-90a8-d9ce7bfb2302-kube-api-access-nzv29\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963575 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/317839b7-786d-4e93-8b37-4dd23e4a5032-images\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963598 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963620 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f7d1e-4721-42db-90a8-d9ce7bfb2302-serving-cert\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963643 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263-metrics-tls\") pod \"dns-operator-744455d44c-l75ns\" (UID: \"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263\") " pod="openshift-dns-operator/dns-operator-744455d44c-l75ns"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963666 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c34009-3606-4b93-9f5f-c8a478aee354-service-ca-bundle\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963688 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eeea6-61f3-4cc7-b51c-f3100d84f707-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963707 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963729 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963751 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9953c224-6f16-4a0d-ad9a-3ea1e4914499-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963773 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-etcd-serving-ca\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963805 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-audit\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963830 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963855 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thmhq\" (UniqueName: \"kubernetes.io/projected/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-kube-api-access-thmhq\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963878 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77477a76-df54-4755-89b0-9b2ec40e098d-serving-cert\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963901 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30f89119-a5f1-41eb-822c-fdd243818a4b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963923 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963946 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgm5\" (UniqueName: \"kubernetes.io/projected/423ab6bf-9cad-43cd-af44-e0cee05b262b-kube-api-access-rkgm5\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.963986 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-etcd-client\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964011 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964034 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964058 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kxt7\" (UniqueName: \"kubernetes.io/projected/8a01a344-a2a2-4d3c-9bc3-5e911936606c-kube-api-access-6kxt7\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964078 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/30b30333-fbb1-476f-8bf2-146f2ee696a7-tmpfs\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964100 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wl8\" (UniqueName: \"kubernetes.io/projected/30b30333-fbb1-476f-8bf2-146f2ee696a7-kube-api-access-x4wl8\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964122 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-config\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964145 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km5sj\" (UniqueName: \"kubernetes.io/projected/78e98b3d-733f-4b7a-abcf-950d6870c04f-kube-api-access-km5sj\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964167 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-oauth-config\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964191 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-client-ca\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964216 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964242 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jzf\" (UniqueName: \"kubernetes.io/projected/6e339e05-6937-4fca-b8f6-060917c05b2c-kube-api-access-46jzf\") pod \"migrator-59844c95c7-wb26t\" (UID: \"6e339e05-6937-4fca-b8f6-060917c05b2c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964265 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8944b4a9-8e7d-4a3a-a526-11a31e795453-srv-cert\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964288 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964310 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-default-certificate\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964333 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91f26391-3d69-4625-a15b-286b29e14161-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964356 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-metrics-certs\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964381 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p822h\" (UniqueName: \"kubernetes.io/projected/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-kube-api-access-p822h\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964416 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d55c18b9-bb80-428c-95ea-f21c6b0694e4-node-pullsecrets\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964439 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-policies\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964463 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-ca\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964485 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f26391-3d69-4625-a15b-286b29e14161-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964508 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2"
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964531 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964553 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-policies\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964580 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc4vw\" (UniqueName: \"kubernetes.io/projected/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-kube-api-access-rc4vw\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964606 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc56f\" (UniqueName: \"kubernetes.io/projected/107eeea6-61f3-4cc7-b51c-f3100d84f707-kube-api-access-tc56f\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964628 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr852\" (UniqueName: \"kubernetes.io/projected/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-kube-api-access-dr852\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964651 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964676 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f26391-3d69-4625-a15b-286b29e14161-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964699 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs8vn\" (UniqueName: \"kubernetes.io/projected/74a6fca6-52bf-4568-baa8-5bbcd0904722-kube-api-access-qs8vn\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964722 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964747 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-client\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964770 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09e6c287-3ef1-4ea0-87c9-54b59acfc772-serving-cert\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964794 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964816 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-trusted-ca\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964842 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4fxm\" (UniqueName: \"kubernetes.io/projected/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-kube-api-access-d4fxm\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:36 crc 
kubenswrapper[4965]: I1125 15:06:36.964865 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964889 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/371d4f01-4337-4da2-8e72-7a79d2a7f98c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksgsk\" (UID: \"371d4f01-4337-4da2-8e72-7a79d2a7f98c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964914 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-client\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964924 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17977cba-cde5-4d42-9c64-1c37568db595-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964938 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964937 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-client-ca\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.964988 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965034 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-serving-cert\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965689 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwb8x\" (UniqueName: \"kubernetes.io/projected/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-kube-api-access-hwb8x\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965722 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30b30333-fbb1-476f-8bf2-146f2ee696a7-webhook-cert\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965746 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e98b3d-733f-4b7a-abcf-950d6870c04f-serving-cert\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965768 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423ab6bf-9cad-43cd-af44-e0cee05b262b-config\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965789 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965813 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17977cba-cde5-4d42-9c64-1c37568db595-proxy-tls\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" 
Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965834 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-srv-cert\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965855 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-config\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.965876 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-serving-cert\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.966750 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423ab6bf-9cad-43cd-af44-e0cee05b262b-trusted-ca\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.966794 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scc9c\" (UniqueName: \"kubernetes.io/projected/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-kube-api-access-scc9c\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.966994 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-stats-auth\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967271 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967394 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-serving-cert\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967487 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs5mv\" (UniqueName: \"kubernetes.io/projected/09e6c287-3ef1-4ea0-87c9-54b59acfc772-kube-api-access-fs5mv\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967572 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/74a6fca6-52bf-4568-baa8-5bbcd0904722-cert\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967649 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423ab6bf-9cad-43cd-af44-e0cee05b262b-serving-cert\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967725 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-service-ca\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967808 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-config\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.967928 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97be63b-6ce3-44ae-b820-25260bf392bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2t52f\" (UID: \"d97be63b-6ce3-44ae-b820-25260bf392bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968038 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-profile-collector-cert\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968114 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968183 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968259 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f89119-a5f1-41eb-822c-fdd243818a4b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968335 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-machine-approver-tls\") pod 
\"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968402 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-service-ca\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968450 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17977cba-cde5-4d42-9c64-1c37568db595-images\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968483 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/107eeea6-61f3-4cc7-b51c-f3100d84f707-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968543 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968569 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-metrics-tls\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968609 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqrjc\" (UniqueName: \"kubernetes.io/projected/17977cba-cde5-4d42-9c64-1c37568db595-kube-api-access-tqrjc\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968628 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-secret-volume\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968651 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-config\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968670 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8944b4a9-8e7d-4a3a-a526-11a31e795453-profile-collector-cert\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968690 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp8f4\" (UniqueName: \"kubernetes.io/projected/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-kube-api-access-wp8f4\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968709 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-image-import-ca\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.968738 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.969485 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-config\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.970178 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.970766 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-oauth-serving-cert\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.971722 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-config\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.972773 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-trusted-ca-bundle\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.973132 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-config\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.973879 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.977846 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pdprk"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.978072 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.978157 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gjnb4"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.976714 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-service-ca-bundle\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.974897 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.976423 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-serving-cert\") pod \"apiserver-76f77b778f-qvn4k\" (UID: 
\"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.979045 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/30b30333-fbb1-476f-8bf2-146f2ee696a7-tmpfs\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.974325 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.979319 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.979753 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eeea6-61f3-4cc7-b51c-f3100d84f707-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.981214 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 
15:06:36.981535 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-serving-cert\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.982578 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-client-ca\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.982637 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-service-ca\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.982628 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d55c18b9-bb80-428c-95ea-f21c6b0694e4-audit-dir\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.983622 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09e6c287-3ef1-4ea0-87c9-54b59acfc772-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 
15:06:36.984134 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.984659 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/107eeea6-61f3-4cc7-b51c-f3100d84f707-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.986197 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-service-ca\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.986656 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d55c18b9-bb80-428c-95ea-f21c6b0694e4-node-pullsecrets\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.987361 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-auth-proxy-config\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.987416 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rgvb"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.987451 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.987886 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.988347 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-policies\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.988837 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nld56"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.989093 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/317839b7-786d-4e93-8b37-4dd23e4a5032-images\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.989589 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-ca\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.990664 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-config\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.990804 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-config\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.990871 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-dir\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.991545 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f26391-3d69-4625-a15b-286b29e14161-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.991769 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97be63b-6ce3-44ae-b820-25260bf392bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2t52f\" (UID: \"d97be63b-6ce3-44ae-b820-25260bf392bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.992223 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.992753 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.993102 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-machine-approver-tls\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.993517 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8944b4a9-8e7d-4a3a-a526-11a31e795453-srv-cert\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.993669 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-profile-collector-cert\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.994419 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/317839b7-786d-4e93-8b37-4dd23e4a5032-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.994598 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.997394 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-config\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.997747 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.997946 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t99cv"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.998278 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263-metrics-tls\") pod \"dns-operator-744455d44c-l75ns\" (UID: \"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263\") " pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.998572 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317839b7-786d-4e93-8b37-4dd23e4a5032-config\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.998595 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-encryption-config\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.999044 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-proxy-tls\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.999207 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826"] Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.999218 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317839b7-786d-4e93-8b37-4dd23e4a5032-config\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:36 crc kubenswrapper[4965]: I1125 15:06:36.999512 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-image-import-ca\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.000257 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.000504 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-etcd-serving-ca\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.001643 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fjgbs\" 
(UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.002212 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.002482 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f26391-3d69-4625-a15b-286b29e14161-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.002693 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d55c18b9-bb80-428c-95ea-f21c6b0694e4-audit\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.003123 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-config\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.003234 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423ab6bf-9cad-43cd-af44-e0cee05b262b-config\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.003927 4965 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.005134 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77477a76-df54-4755-89b0-9b2ec40e098d-serving-cert\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.005396 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.005799 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9953c224-6f16-4a0d-ad9a-3ea1e4914499-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.006382 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.006482 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-srv-cert\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" 
Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.007175 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-serving-cert\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.008028 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.009502 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mlsxj"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.010819 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d55c18b9-bb80-428c-95ea-f21c6b0694e4-etcd-client\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.010993 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09e6c287-3ef1-4ea0-87c9-54b59acfc772-serving-cert\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.011148 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-etcd-client\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.011377 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8944b4a9-8e7d-4a3a-a526-11a31e795453-profile-collector-cert\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.011813 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e98b3d-733f-4b7a-abcf-950d6870c04f-serving-cert\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.012651 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.013086 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.013468 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.013535 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-oauth-config\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.013787 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-secret-volume\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.013956 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423ab6bf-9cad-43cd-af44-e0cee05b262b-serving-cert\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.014215 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17977cba-cde5-4d42-9c64-1c37568db595-proxy-tls\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: 
\"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.014249 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.015553 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.015607 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.017181 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.018718 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mlsxj"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.020960 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.022505 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lcspp"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.023499 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.024453 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-t99cv"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.025231 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.026025 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.027039 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lxkkv"] Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.027700 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.031732 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9953c224-6f16-4a0d-ad9a-3ea1e4914499-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.056577 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.065003 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.084956 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.085986 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-signing-cabundle\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.099131 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.099290 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ded91b12-10e5-4c46-aaa7-e54c34072789-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lcspp\" (UID: \"ded91b12-10e5-4c46-aaa7-e54c34072789\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.099803 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnlzk\" (UniqueName: \"kubernetes.io/projected/ded91b12-10e5-4c46-aaa7-e54c34072789-kube-api-access-hnlzk\") pod \"multus-admission-controller-857f4d67dd-lcspp\" (UID: \"ded91b12-10e5-4c46-aaa7-e54c34072789\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.099912 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whbrg\" (UniqueName: \"kubernetes.io/projected/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-kube-api-access-whbrg\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100007 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzv29\" (UniqueName: \"kubernetes.io/projected/439f7d1e-4721-42db-90a8-d9ce7bfb2302-kube-api-access-nzv29\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100097 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f7d1e-4721-42db-90a8-d9ce7bfb2302-serving-cert\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100187 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100276 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thmhq\" (UniqueName: \"kubernetes.io/projected/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-kube-api-access-thmhq\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100356 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/30f89119-a5f1-41eb-822c-fdd243818a4b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100434 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100533 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jzf\" (UniqueName: \"kubernetes.io/projected/6e339e05-6937-4fca-b8f6-060917c05b2c-kube-api-access-46jzf\") pod \"migrator-59844c95c7-wb26t\" (UID: \"6e339e05-6937-4fca-b8f6-060917c05b2c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100640 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p822h\" (UniqueName: \"kubernetes.io/projected/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-kube-api-access-p822h\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100735 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-policies\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc 
kubenswrapper[4965]: I1125 15:06:37.100839 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.100919 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs8vn\" (UniqueName: \"kubernetes.io/projected/74a6fca6-52bf-4568-baa8-5bbcd0904722-kube-api-access-qs8vn\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101110 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101181 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101254 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-trusted-ca\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: 
\"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101450 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-client\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101524 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101698 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-serving-cert\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101874 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a6fca6-52bf-4568-baa8-5bbcd0904722-cert\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.101947 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-config\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: 
\"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102138 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f89119-a5f1-41eb-822c-fdd243818a4b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102227 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-metrics-tls\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102332 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102491 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-dir\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102566 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/30f89119-a5f1-41eb-822c-fdd243818a4b-config\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102653 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-encryption-config\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102727 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgr2\" (UniqueName: \"kubernetes.io/projected/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-kube-api-access-xlgr2\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.102814 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f7d1e-4721-42db-90a8-d9ce7bfb2302-config\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.103671 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.106854 
4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.110934 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-dir\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.124887 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.144981 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.165818 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.177647 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-signing-key\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.185118 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.207303 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.207680 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.225804 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.236107 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.245178 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.265683 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.286109 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.305422 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.326517 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 
15:06:37.336644 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-config-volume\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.347754 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.365747 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.386466 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.404931 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.414556 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-metrics-certs\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.426003 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.444729 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.453654 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/371d4f01-4337-4da2-8e72-7a79d2a7f98c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksgsk\" (UID: \"371d4f01-4337-4da2-8e72-7a79d2a7f98c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.465109 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.475621 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c34009-3606-4b93-9f5f-c8a478aee354-service-ca-bundle\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.484792 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.494870 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-stats-auth\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.504664 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.511509 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09c34009-3606-4b93-9f5f-c8a478aee354-default-certificate\") pod \"router-default-5444994796-82czk\" (UID: 
\"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.524916 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.533727 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30b30333-fbb1-476f-8bf2-146f2ee696a7-apiservice-cert\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.533735 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30b30333-fbb1-476f-8bf2-146f2ee696a7-webhook-cert\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.546063 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.565768 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.577098 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.590645 4965 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.601864 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.606008 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.625425 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.645062 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.664878 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.685540 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.704708 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.713661 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ded91b12-10e5-4c46-aaa7-e54c34072789-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lcspp\" (UID: 
\"ded91b12-10e5-4c46-aaa7-e54c34072789\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.725905 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.746647 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.765733 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.786074 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.804730 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.825637 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.837609 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-metrics-tls\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.846003 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.863585 4965 request.go:700] Waited for 1.011271773s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dtrusted-ca&limit=500&resourceVersion=0 Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.872539 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.875878 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-trusted-ca\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.886024 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.925158 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.934685 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-config\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.945344 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.965122 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 25 
15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.982942 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:37 crc kubenswrapper[4965]: I1125 15:06:37.985477 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.005727 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.024703 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.045983 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.067187 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.067328 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.087725 4965 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104186 4965 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104221 4965 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104302 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/439f7d1e-4721-42db-90a8-d9ce7bfb2302-serving-cert podName:439f7d1e-4721-42db-90a8-d9ce7bfb2302 nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.604266367 +0000 UTC m=+143.571860113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/439f7d1e-4721-42db-90a8-d9ce7bfb2302-serving-cert") pod "service-ca-operator-777779d784-kp5tp" (UID: "439f7d1e-4721-42db-90a8-d9ce7bfb2302") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104323 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-samples-operator-tls podName:547f6e0d-0e57-4ebd-94bb-1cff5d6fedef nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.604314568 +0000 UTC m=+143.571908314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-kd826" (UID: "547f6e0d-0e57-4ebd-94bb-1cff5d6fedef") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104194 4965 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104355 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-trusted-ca-bundle podName:ed9961f3-d3a0-47aa-a9c4-e311468d76ff nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.604347819 +0000 UTC m=+143.571941565 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-trusted-ca-bundle") pod "apiserver-7bbb656c7d-6xcqz" (UID: "ed9961f3-d3a0-47aa-a9c4-e311468d76ff") : failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104399 4965 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104431 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30f89119-a5f1-41eb-822c-fdd243818a4b-config podName:30f89119-a5f1-41eb-822c-fdd243818a4b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.604422171 +0000 UTC m=+143.572015917 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/30f89119-a5f1-41eb-822c-fdd243818a4b-config") pod "kube-apiserver-operator-766d6c64bb-wkf6l" (UID: "30f89119-a5f1-41eb-822c-fdd243818a4b") : failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104884 4965 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104913 4965 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104930 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-encryption-config podName:ed9961f3-d3a0-47aa-a9c4-e311468d76ff nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.604919495 +0000 UTC m=+143.572513241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-encryption-config") pod "apiserver-7bbb656c7d-6xcqz" (UID: "ed9961f3-d3a0-47aa-a9c4-e311468d76ff") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104952 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30f89119-a5f1-41eb-822c-fdd243818a4b-serving-cert podName:30f89119-a5f1-41eb-822c-fdd243818a4b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.604941586 +0000 UTC m=+143.572535332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/30f89119-a5f1-41eb-822c-fdd243818a4b-serving-cert") pod "kube-apiserver-operator-766d6c64bb-wkf6l" (UID: "30f89119-a5f1-41eb-822c-fdd243818a4b") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.104914 4965 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.105007 4965 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.105050 4965 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106080 4965 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106102 4965 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106169 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/439f7d1e-4721-42db-90a8-d9ce7bfb2302-config podName:439f7d1e-4721-42db-90a8-d9ce7bfb2302 nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.606155499 +0000 UTC m=+143.573749245 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/439f7d1e-4721-42db-90a8-d9ce7bfb2302-config") pod "service-ca-operator-777779d784-kp5tp" (UID: "439f7d1e-4721-42db-90a8-d9ce7bfb2302") : failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106368 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-serving-ca podName:ed9961f3-d3a0-47aa-a9c4-e311468d76ff nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.606352954 +0000 UTC m=+143.573946710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-serving-ca") pod "apiserver-7bbb656c7d-6xcqz" (UID: "ed9961f3-d3a0-47aa-a9c4-e311468d76ff") : failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106385 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-policies podName:ed9961f3-d3a0-47aa-a9c4-e311468d76ff nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.606377635 +0000 UTC m=+143.573971381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-policies") pod "apiserver-7bbb656c7d-6xcqz" (UID: "ed9961f3-d3a0-47aa-a9c4-e311468d76ff") : failed to sync configmap cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106400 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a6fca6-52bf-4568-baa8-5bbcd0904722-cert podName:74a6fca6-52bf-4568-baa8-5bbcd0904722 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:06:38.606392955 +0000 UTC m=+143.573986701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74a6fca6-52bf-4568-baa8-5bbcd0904722-cert") pod "ingress-canary-gjnb4" (UID: "74a6fca6-52bf-4568-baa8-5bbcd0904722") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.106415 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-serving-cert podName:ed9961f3-d3a0-47aa-a9c4-e311468d76ff nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.606409516 +0000 UTC m=+143.574003262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-serving-cert") pod "apiserver-7bbb656c7d-6xcqz" (UID: "ed9961f3-d3a0-47aa-a9c4-e311468d76ff") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.106774 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.107993 4965 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: E1125 15:06:38.108057 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-client podName:ed9961f3-d3a0-47aa-a9c4-e311468d76ff nodeName:}" failed. No retries permitted until 2025-11-25 15:06:38.608026861 +0000 UTC m=+143.575620607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-client") pod "apiserver-7bbb656c7d-6xcqz" (UID: "ed9961f3-d3a0-47aa-a9c4-e311468d76ff") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.126001 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.145574 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.166087 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.185595 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.204821 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.225908 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.245575 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.265674 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.284876 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.305159 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.325162 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.344630 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.365766 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.386140 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.405658 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.425250 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.445785 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.465202 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.485642 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.505787 4965 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.525712 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.545539 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.565587 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.585615 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.626819 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f89119-a5f1-41eb-822c-fdd243818a4b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.626910 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.626950 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f89119-a5f1-41eb-822c-fdd243818a4b-config\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: 
\"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627033 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-encryption-config\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627094 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f7d1e-4721-42db-90a8-d9ce7bfb2302-config\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627270 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f7d1e-4721-42db-90a8-d9ce7bfb2302-serving-cert\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627367 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627453 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-policies\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627504 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627544 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-client\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627591 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-serving-cert\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.627621 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a6fca6-52bf-4568-baa8-5bbcd0904722-cert\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.628007 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/30f89119-a5f1-41eb-822c-fdd243818a4b-config\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.628634 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f7d1e-4721-42db-90a8-d9ce7bfb2302-config\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.629105 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.629121 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.629168 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-audit-policies\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.630725 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-encryption-config\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.631563 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f89119-a5f1-41eb-822c-fdd243818a4b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.633636 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.633757 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-serving-cert\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.634000 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a6fca6-52bf-4568-baa8-5bbcd0904722-cert\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.634565 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-etcd-client\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.635500 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f7d1e-4721-42db-90a8-d9ce7bfb2302-serving-cert\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.646797 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfllt\" (UniqueName: \"kubernetes.io/projected/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-kube-api-access-rfllt\") pod \"console-f9d7485db-pdprk\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.675784 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7pjw\" (UniqueName: \"kubernetes.io/projected/319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03-kube-api-access-t7pjw\") pod \"etcd-operator-b45778765-67kvn\" (UID: \"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.682318 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.684593 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznsz\" (UniqueName: \"kubernetes.io/projected/09c34009-3606-4b93-9f5f-c8a478aee354-kube-api-access-vznsz\") pod \"router-default-5444994796-82czk\" (UID: \"09c34009-3606-4b93-9f5f-c8a478aee354\") " pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.700887 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccg6s\" (UniqueName: \"kubernetes.io/projected/b6686556-54b1-4232-a5ec-7bacd966ff86-kube-api-access-ccg6s\") pod \"downloads-7954f5f757-2pf5t\" (UID: \"b6686556-54b1-4232-a5ec-7bacd966ff86\") " pod="openshift-console/downloads-7954f5f757-2pf5t" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.722732 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scc9c\" (UniqueName: \"kubernetes.io/projected/0c505da0-8f2d-4cb8-b93d-9351ea9f82ec-kube-api-access-scc9c\") pod \"catalog-operator-68c6474976-pzvqw\" (UID: \"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.740098 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf75w\" (UniqueName: \"kubernetes.io/projected/371d4f01-4337-4da2-8e72-7a79d2a7f98c-kube-api-access-cf75w\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksgsk\" (UID: \"371d4f01-4337-4da2-8e72-7a79d2a7f98c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.758842 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l87vz\" (UniqueName: 
\"kubernetes.io/projected/d55c18b9-bb80-428c-95ea-f21c6b0694e4-kube-api-access-l87vz\") pod \"apiserver-76f77b778f-qvn4k\" (UID: \"d55c18b9-bb80-428c-95ea-f21c6b0694e4\") " pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.763167 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.790676 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wl8\" (UniqueName: \"kubernetes.io/projected/30b30333-fbb1-476f-8bf2-146f2ee696a7-kube-api-access-x4wl8\") pod \"packageserver-d55dfcdfc-268bv\" (UID: \"30b30333-fbb1-476f-8bf2-146f2ee696a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.799777 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlbd\" (UniqueName: \"kubernetes.io/projected/7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263-kube-api-access-xzlbd\") pod \"dns-operator-744455d44c-l75ns\" (UID: \"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263\") " pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.847560 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzwr\" (UniqueName: \"kubernetes.io/projected/4ee3b773-c658-424e-ab4d-c6ba3e866ce2-kube-api-access-pzzwr\") pod \"machine-config-controller-84d6567774-gb2gn\" (UID: \"4ee3b773-c658-424e-ab4d-c6ba3e866ce2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.851452 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.851516 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kxt7\" (UniqueName: \"kubernetes.io/projected/8a01a344-a2a2-4d3c-9bc3-5e911936606c-kube-api-access-6kxt7\") pod \"oauth-openshift-558db77b4-fjgbs\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.861554 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqrjc\" (UniqueName: \"kubernetes.io/projected/17977cba-cde5-4d42-9c64-1c37568db595-kube-api-access-tqrjc\") pod \"machine-config-operator-74547568cd-rm7qh\" (UID: \"17977cba-cde5-4d42-9c64-1c37568db595\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.864122 4965 request.go:700] Waited for 1.889815387s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/olm-operator-serviceaccount/token Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.872465 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.876755 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.891025 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rvq\" (UniqueName: \"kubernetes.io/projected/d97be63b-6ce3-44ae-b820-25260bf392bf-kube-api-access-77rvq\") pod \"package-server-manager-789f6589d5-2t52f\" (UID: \"d97be63b-6ce3-44ae-b820-25260bf392bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.903796 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5sj\" (UniqueName: \"kubernetes.io/projected/78e98b3d-733f-4b7a-abcf-950d6870c04f-kube-api-access-km5sj\") pod \"route-controller-manager-6576b87f9c-ndw52\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.921822 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6vz\" (UniqueName: \"kubernetes.io/projected/8944b4a9-8e7d-4a3a-a526-11a31e795453-kube-api-access-lq6vz\") pod \"olm-operator-6b444d44fb-frm99\" (UID: \"8944b4a9-8e7d-4a3a-a526-11a31e795453\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.924342 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2pf5t" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.941618 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.945322 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pdprk"] Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.947727 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8wgm\" (UniqueName: \"kubernetes.io/projected/317839b7-786d-4e93-8b37-4dd23e4a5032-kube-api-access-x8wgm\") pod \"machine-api-operator-5694c8668f-b6gl2\" (UID: \"317839b7-786d-4e93-8b37-4dd23e4a5032\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.951432 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.956635 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.962904 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.972510 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgbdl\" (UniqueName: \"kubernetes.io/projected/77477a76-df54-4755-89b0-9b2ec40e098d-kube-api-access-lgbdl\") pod \"controller-manager-879f6c89f-8rgvb\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:38 crc kubenswrapper[4965]: I1125 15:06:38.982860 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzj4\" (UniqueName: \"kubernetes.io/projected/8252e47a-59aa-4a86-8f97-d12cc21fc6d8-kube-api-access-grzj4\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmhs2\" (UID: \"8252e47a-59aa-4a86-8f97-d12cc21fc6d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.013373 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p52vj\" (UniqueName: \"kubernetes.io/projected/9953c224-6f16-4a0d-ad9a-3ea1e4914499-kube-api-access-p52vj\") pod \"openshift-controller-manager-operator-756b6f6bc6-g9mfz\" (UID: \"9953c224-6f16-4a0d-ad9a-3ea1e4914499\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.025637 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs5mv\" (UniqueName: \"kubernetes.io/projected/09e6c287-3ef1-4ea0-87c9-54b59acfc772-kube-api-access-fs5mv\") pod \"authentication-operator-69f744f599-jv944\" (UID: \"09e6c287-3ef1-4ea0-87c9-54b59acfc772\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 
15:06:39.039486 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.047767 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.050759 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.061963 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.078938 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp8f4\" (UniqueName: \"kubernetes.io/projected/8debd4d2-5319-4aa1-bd2d-aee777eba0ba-kube-api-access-wp8f4\") pod \"cluster-image-registry-operator-dc59b4c8b-lh8z6\" (UID: \"8debd4d2-5319-4aa1-bd2d-aee777eba0ba\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.087986 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91f26391-3d69-4625-a15b-286b29e14161-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tbt5x\" (UID: \"91f26391-3d69-4625-a15b-286b29e14161\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.117426 4965 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.127425 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.136188 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc4vw\" (UniqueName: \"kubernetes.io/projected/ac3b054d-ca2f-4ca3-a3fa-6772cad2a377-kube-api-access-rc4vw\") pod \"machine-approver-56656f9798-6g7bs\" (UID: \"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.136457 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc56f\" (UniqueName: \"kubernetes.io/projected/107eeea6-61f3-4cc7-b51c-f3100d84f707-kube-api-access-tc56f\") pod \"openshift-apiserver-operator-796bbdcf4f-bxkng\" (UID: \"107eeea6-61f3-4cc7-b51c-f3100d84f707\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.141505 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.149929 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.157672 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr852\" (UniqueName: \"kubernetes.io/projected/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-kube-api-access-dr852\") pod \"marketplace-operator-79b997595-nld56\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.165759 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwb8x\" (UniqueName: \"kubernetes.io/projected/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-kube-api-access-hwb8x\") pod \"collect-profiles-29401380-dlnxs\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.176396 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.181282 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgm5\" (UniqueName: \"kubernetes.io/projected/423ab6bf-9cad-43cd-af44-e0cee05b262b-kube-api-access-rkgm5\") pod \"console-operator-58897d9998-5mpvj\" (UID: \"423ab6bf-9cad-43cd-af44-e0cee05b262b\") " pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.183875 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.186016 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.191549 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.200645 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.209746 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.213245 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4fxm\" (UniqueName: \"kubernetes.io/projected/573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390-kube-api-access-d4fxm\") pod \"service-ca-9c57cc56f-7hnpl\" (UID: \"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390\") " pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.219381 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.227238 4965 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.232370 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.246039 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.246090 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.268399 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.268718 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.283567 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.285932 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.301092 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.307727 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.313099 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.326097 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.345380 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.370176 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.392020 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2pf5t"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.406940 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whbrg\" (UniqueName: \"kubernetes.io/projected/ed9961f3-d3a0-47aa-a9c4-e311468d76ff-kube-api-access-whbrg\") pod \"apiserver-7bbb656c7d-6xcqz\" (UID: \"ed9961f3-d3a0-47aa-a9c4-e311468d76ff\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.423931 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qvn4k"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.438416 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnlzk\" (UniqueName: \"kubernetes.io/projected/ded91b12-10e5-4c46-aaa7-e54c34072789-kube-api-access-hnlzk\") pod \"multus-admission-controller-857f4d67dd-lcspp\" (UID: \"ded91b12-10e5-4c46-aaa7-e54c34072789\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.445148 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-p822h\" (UniqueName: \"kubernetes.io/projected/547f6e0d-0e57-4ebd-94bb-1cff5d6fedef-kube-api-access-p822h\") pod \"cluster-samples-operator-665b6dd947-kd826\" (UID: \"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.460942 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.465849 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzv29\" (UniqueName: \"kubernetes.io/projected/439f7d1e-4721-42db-90a8-d9ce7bfb2302-kube-api-access-nzv29\") pod \"service-ca-operator-777779d784-kp5tp\" (UID: \"439f7d1e-4721-42db-90a8-d9ce7bfb2302\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.473326 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.491397 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9f89ef7-da88-4a5b-a0ca-5474493bf82f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xtpw9\" (UID: \"a9f89ef7-da88-4a5b-a0ca-5474493bf82f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.501704 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv"] Nov 25 15:06:39 crc kubenswrapper[4965]: W1125 15:06:39.508765 4965 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17977cba_cde5_4d42_9c64_1c37568db595.slice/crio-a911234d03011eb179cedb003d0912e9a56fe573570ffbbc9fb442ae15371ece WatchSource:0}: Error finding container a911234d03011eb179cedb003d0912e9a56fe573570ffbbc9fb442ae15371ece: Status 404 returned error can't find the container with id a911234d03011eb179cedb003d0912e9a56fe573570ffbbc9fb442ae15371ece Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.509751 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thmhq\" (UniqueName: \"kubernetes.io/projected/d3d12281-f14b-432b-bc6a-7d8d4b9e933e-kube-api-access-thmhq\") pod \"openshift-config-operator-7777fb866f-2b4mm\" (UID: \"d3d12281-f14b-432b-bc6a-7d8d4b9e933e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.514640 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" event={"ID":"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377","Type":"ContainerStarted","Data":"967f9cf0db96b6b003a55c70d2f2f16c28d1e7bffcc8a85e90e2dfb41888613d"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.527030 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.532976 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-82czk" event={"ID":"09c34009-3606-4b93-9f5f-c8a478aee354","Type":"ContainerStarted","Data":"83e1b5ac02f27c9140375b38a8033abc57e47ea609c468d97df72e672976a226"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.533018 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-82czk" 
event={"ID":"09c34009-3606-4b93-9f5f-c8a478aee354","Type":"ContainerStarted","Data":"09c4574bd88db2ea6fbdc50da2b045c84b7e01ec3876c6d7a4b0885120ad4b68"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.533762 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30f89119-a5f1-41eb-822c-fdd243818a4b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wkf6l\" (UID: \"30f89119-a5f1-41eb-822c-fdd243818a4b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.536185 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-67kvn"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.542613 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2pf5t" event={"ID":"b6686556-54b1-4232-a5ec-7bacd966ff86","Type":"ContainerStarted","Data":"34df87ed6fba088451f5a2a73975c73c0544376acffae2f21ed2541fef0f5600"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.548117 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jzf\" (UniqueName: \"kubernetes.io/projected/6e339e05-6937-4fca-b8f6-060917c05b2c-kube-api-access-46jzf\") pod \"migrator-59844c95c7-wb26t\" (UID: \"6e339e05-6937-4fca-b8f6-060917c05b2c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.549226 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" event={"ID":"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec","Type":"ContainerStarted","Data":"10db808c993bcff1234ecd6e4696805a431c8ef18366b8d2e7f22790d2b73670"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.549249 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" event={"ID":"0c505da0-8f2d-4cb8-b93d-9351ea9f82ec","Type":"ContainerStarted","Data":"b0627573ecb5dc86ea508ff563f23dd96f610e511b9c828ecfeee686510f941a"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.549831 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.558131 4965 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pzvqw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.558170 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" podUID="0c505da0-8f2d-4cb8-b93d-9351ea9f82ec" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.561696 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdprk" event={"ID":"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0","Type":"ContainerStarted","Data":"8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.561733 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdprk" event={"ID":"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0","Type":"ContainerStarted","Data":"865c8ba5942ac9f1ed321c656b1ec762a15c73178aad0fc7da19476289a38a21"} Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.572759 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.585024 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.586500 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs8vn\" (UniqueName: \"kubernetes.io/projected/74a6fca6-52bf-4568-baa8-5bbcd0904722-kube-api-access-qs8vn\") pod \"ingress-canary-gjnb4\" (UID: \"74a6fca6-52bf-4568-baa8-5bbcd0904722\") " pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.593118 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.605035 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgr2\" (UniqueName: \"kubernetes.io/projected/ef9cb97a-abf5-477c-bcde-ed43ba1a80fb-kube-api-access-xlgr2\") pod \"ingress-operator-5b745b69d9-lcr9f\" (UID: \"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.612904 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.625573 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.628747 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.640923 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.641524 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.647682 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8c017b51-468b-4ff4-9524-1e4349a54323-ca-trust-extracted\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.647716 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8c017b51-468b-4ff4-9524-1e4349a54323-installation-pull-secrets\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.647735 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdx55\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-kube-api-access-fdx55\") pod \"image-registry-697d97f7c8-22g9m\" (UID: 
\"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.647792 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.647808 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-registry-tls\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.647855 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-trusted-ca\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.648023 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-bound-sa-token\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.648115 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-registry-certificates\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: E1125 15:06:39.648329 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.148302371 +0000 UTC m=+145.115896177 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.655351 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.663368 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.669138 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gjnb4" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749202 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749485 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59cdf509-5573-4cea-acca-7a5830b58bcf-metrics-tls\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749543 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7xr\" (UniqueName: \"kubernetes.io/projected/59cdf509-5573-4cea-acca-7a5830b58bcf-kube-api-access-2p7xr\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749596 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-plugins-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749663 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/920d465f-64d8-4954-acfe-4a12f1ea7739-node-bootstrap-token\") pod \"machine-config-server-lxkkv\" (UID: 
\"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749695 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j44j4\" (UniqueName: \"kubernetes.io/projected/920d465f-64d8-4954-acfe-4a12f1ea7739-kube-api-access-j44j4\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749737 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-registry-tls\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749754 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-registration-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749768 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xjk\" (UniqueName: \"kubernetes.io/projected/50a60857-8df9-45be-91b8-a41878677884-kube-api-access-d2xjk\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749818 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-trusted-ca\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.749910 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-bound-sa-token\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750045 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-registry-certificates\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750107 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-csi-data-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750288 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cdf509-5573-4cea-acca-7a5830b58bcf-config-volume\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750446 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8c017b51-468b-4ff4-9524-1e4349a54323-ca-trust-extracted\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750527 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-socket-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750544 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8c017b51-468b-4ff4-9524-1e4349a54323-installation-pull-secrets\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750570 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdx55\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-kube-api-access-fdx55\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.750605 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/920d465f-64d8-4954-acfe-4a12f1ea7739-certs\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 
15:06:39.750667 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-mountpoint-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: E1125 15:06:39.751094 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.251076455 +0000 UTC m=+145.218670211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.766557 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8c017b51-468b-4ff4-9524-1e4349a54323-ca-trust-extracted\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.781199 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-registry-certificates\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.782076 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-trusted-ca\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.785637 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rgvb"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.793016 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-registry-tls\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.793300 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fjgbs"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.794655 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8c017b51-468b-4ff4-9524-1e4349a54323-installation-pull-secrets\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.802035 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.803146 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-bound-sa-token\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.804154 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l75ns"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.829624 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdx55\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-kube-api-access-fdx55\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.852799 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-socket-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.852843 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/920d465f-64d8-4954-acfe-4a12f1ea7739-certs\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.852901 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-mountpoint-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " 
pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.852928 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59cdf509-5573-4cea-acca-7a5830b58bcf-metrics-tls\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.852952 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7xr\" (UniqueName: \"kubernetes.io/projected/59cdf509-5573-4cea-acca-7a5830b58bcf-kube-api-access-2p7xr\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.852990 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-plugins-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853012 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/920d465f-64d8-4954-acfe-4a12f1ea7739-node-bootstrap-token\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853045 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j44j4\" (UniqueName: \"kubernetes.io/projected/920d465f-64d8-4954-acfe-4a12f1ea7739-kube-api-access-j44j4\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " 
pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853069 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853092 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-registration-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853113 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xjk\" (UniqueName: \"kubernetes.io/projected/50a60857-8df9-45be-91b8-a41878677884-kube-api-access-d2xjk\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853171 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-csi-data-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853204 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cdf509-5573-4cea-acca-7a5830b58bcf-config-volume\") pod \"dns-default-mlsxj\" (UID: 
\"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853941 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-mountpoint-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.853946 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-plugins-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.854048 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-socket-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.855120 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cdf509-5573-4cea-acca-7a5830b58bcf-config-volume\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.856012 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-registration-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc 
kubenswrapper[4965]: I1125 15:06:39.856365 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/50a60857-8df9-45be-91b8-a41878677884-csi-data-dir\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: E1125 15:06:39.856521 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.356506293 +0000 UTC m=+145.324100119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.857709 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59cdf509-5573-4cea-acca-7a5830b58bcf-metrics-tls\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.864578 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/920d465f-64d8-4954-acfe-4a12f1ea7739-node-bootstrap-token\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 
15:06:39.877878 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b6gl2"] Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.880766 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/920d465f-64d8-4954-acfe-4a12f1ea7739-certs\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.915233 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7xr\" (UniqueName: \"kubernetes.io/projected/59cdf509-5573-4cea-acca-7a5830b58bcf-kube-api-access-2p7xr\") pod \"dns-default-mlsxj\" (UID: \"59cdf509-5573-4cea-acca-7a5830b58bcf\") " pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.943218 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.945357 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.945396 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.945952 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"] Nov 25 
15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.948754 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xjk\" (UniqueName: \"kubernetes.io/projected/50a60857-8df9-45be-91b8-a41878677884-kube-api-access-d2xjk\") pod \"csi-hostpathplugin-t99cv\" (UID: \"50a60857-8df9-45be-91b8-a41878677884\") " pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.955456 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:39 crc kubenswrapper[4965]: E1125 15:06:39.955888 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.455872453 +0000 UTC m=+145.423466199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.958786 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j44j4\" (UniqueName: \"kubernetes.io/projected/920d465f-64d8-4954-acfe-4a12f1ea7739-kube-api-access-j44j4\") pod \"machine-config-server-lxkkv\" (UID: \"920d465f-64d8-4954-acfe-4a12f1ea7739\") " pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.994311 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" Nov 25 15:06:39 crc kubenswrapper[4965]: I1125 15:06:39.994499 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz"] Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.011214 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.014226 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lxkkv" Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.057642 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.058029 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.558016151 +0000 UTC m=+145.525609897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:40 crc kubenswrapper[4965]: W1125 15:06:40.103635 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97be63b_6ce3_44ae_b820_25260bf392bf.slice/crio-0950c338a2e71904863981557db2c99aa1f78a9daf7be3a79319801e332791f1 WatchSource:0}: Error finding container 0950c338a2e71904863981557db2c99aa1f78a9daf7be3a79319801e332791f1: Status 404 returned error can't find the container with id 0950c338a2e71904863981557db2c99aa1f78a9daf7be3a79319801e332791f1 Nov 25 15:06:40 crc kubenswrapper[4965]: W1125 15:06:40.148040 4965 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77477a76_df54_4755_89b0_9b2ec40e098d.slice/crio-8b54bc2783c91d73c9fbd0507c65f19d9f00cb999e1d93326ebbd4a7313b6c3b WatchSource:0}: Error finding container 8b54bc2783c91d73c9fbd0507c65f19d9f00cb999e1d93326ebbd4a7313b6c3b: Status 404 returned error can't find the container with id 8b54bc2783c91d73c9fbd0507c65f19d9f00cb999e1d93326ebbd4a7313b6c3b Nov 25 15:06:40 crc kubenswrapper[4965]: W1125 15:06:40.162250 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a01a344_a2a2_4d3c_9bc3_5e911936606c.slice/crio-de8828ac02ad923511d9e829e6984e10bd0ea4ed7731c269483f7316ebb1fdf8 WatchSource:0}: Error finding container de8828ac02ad923511d9e829e6984e10bd0ea4ed7731c269483f7316ebb1fdf8: Status 404 returned error can't find the container with id de8828ac02ad923511d9e829e6984e10bd0ea4ed7731c269483f7316ebb1fdf8 Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.162500 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.162702 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.662683427 +0000 UTC m=+145.630277173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.162859 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.163164 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.66315242 +0000 UTC m=+145.630746166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.264622 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.269913 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.769877923 +0000 UTC m=+145.737471759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.351998 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jv944"] Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.353747 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs"] Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.355767 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6"] Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.357219 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nld56"] Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.366504 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.366826 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 15:06:40.866814797 +0000 UTC m=+145.834408533 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.467129 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.467620 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:40.967601987 +0000 UTC m=+145.935195733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.514202 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng"]
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.533309 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99"]
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.538959 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2"]
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.573817 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.574189 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.074178356 +0000 UTC m=+146.041772102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.609259 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" event={"ID":"371d4f01-4337-4da2-8e72-7a79d2a7f98c","Type":"ContainerStarted","Data":"47eec918fa6f9cd3c003c529f937919d067233e25f98b1342b6b1475be3e8c4f"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.609303 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" event={"ID":"371d4f01-4337-4da2-8e72-7a79d2a7f98c","Type":"ContainerStarted","Data":"e7e3a47416f3982a56f859102e9ce4eee5b6cdf5c103bfb05b2f3a7a70b8b60f"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.653295 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" event={"ID":"d97be63b-6ce3-44ae-b820-25260bf392bf","Type":"ContainerStarted","Data":"0950c338a2e71904863981557db2c99aa1f78a9daf7be3a79319801e332791f1"}
Nov 25 15:06:40 crc kubenswrapper[4965]: W1125 15:06:40.659187 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod920d465f_64d8_4954_acfe_4a12f1ea7739.slice/crio-39bb834e22971907af7d9f97bbcc6e9e99148a489a678034be512eef8bc7273d WatchSource:0}: Error finding container 39bb834e22971907af7d9f97bbcc6e9e99148a489a678034be512eef8bc7273d: Status 404 returned error can't find the container with id 39bb834e22971907af7d9f97bbcc6e9e99148a489a678034be512eef8bc7273d
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.659639 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" event={"ID":"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1","Type":"ContainerStarted","Data":"224577b33633e16adf2d5e2bf418384650794cbe4d5d0ba3d38ecab52a81bf32"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.684578 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.685834 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.185817825 +0000 UTC m=+146.153411571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.698861 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" event={"ID":"17977cba-cde5-4d42-9c64-1c37568db595","Type":"ContainerStarted","Data":"0810356d02b6a03691cc6648c9f471aebf3329702d055e17ce74d0171d098322"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.698908 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" event={"ID":"17977cba-cde5-4d42-9c64-1c37568db595","Type":"ContainerStarted","Data":"a911234d03011eb179cedb003d0912e9a56fe573570ffbbc9fb442ae15371ece"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.722166 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" event={"ID":"30b30333-fbb1-476f-8bf2-146f2ee696a7","Type":"ContainerStarted","Data":"5929de30c0d5de304aa12082649d5185638bd5d72b5557ed7b9f604fab0df43f"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.742212 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" event={"ID":"8a01a344-a2a2-4d3c-9bc3-5e911936606c","Type":"ContainerStarted","Data":"de8828ac02ad923511d9e829e6984e10bd0ea4ed7731c269483f7316ebb1fdf8"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.815136 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.817707 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.317683078 +0000 UTC m=+146.285276824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.838006 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" event={"ID":"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a","Type":"ContainerStarted","Data":"2a740e80da2e6c91c3f1fd75501970239d353d356f135be04cc628654076d685"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.841382 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7hnpl"]
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.841673 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" event={"ID":"d55c18b9-bb80-428c-95ea-f21c6b0694e4","Type":"ContainerStarted","Data":"a6089782ec3f3e0f63fcc79179994e0822de9aa90fb53cbe84da5c7681838558"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.841765 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5mpvj"]
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.841849 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x"]
Nov 25 15:06:40 crc kubenswrapper[4965]: W1125 15:06:40.847979 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423ab6bf_9cad_43cd_af44_e0cee05b262b.slice/crio-c2c86024b75564f92dcc1eebbfb7e6467c60d1a142f1ab15c596c619e03699d9 WatchSource:0}: Error finding container c2c86024b75564f92dcc1eebbfb7e6467c60d1a142f1ab15c596c619e03699d9: Status 404 returned error can't find the container with id c2c86024b75564f92dcc1eebbfb7e6467c60d1a142f1ab15c596c619e03699d9
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.873145 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" event={"ID":"09e6c287-3ef1-4ea0-87c9-54b59acfc772","Type":"ContainerStarted","Data":"da7ab12b6be56c095409b02b2c42d402878f99f8616fc13486044f37530b6bdc"}
Nov 25 15:06:40 crc kubenswrapper[4965]: W1125 15:06:40.884049 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573dcfaf_c9ea_4c83_b2b1_8c54c2f7d390.slice/crio-98d61173b0e4b6e551d808a6da5ab58ae1b77ed2164fd9cd89e2040e5a0389b3 WatchSource:0}: Error finding container 98d61173b0e4b6e551d808a6da5ab58ae1b77ed2164fd9cd89e2040e5a0389b3: Status 404 returned error can't find the container with id 98d61173b0e4b6e551d808a6da5ab58ae1b77ed2164fd9cd89e2040e5a0389b3
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.885137 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2pf5t" event={"ID":"b6686556-54b1-4232-a5ec-7bacd966ff86","Type":"ContainerStarted","Data":"e2b2658f0fe407838b3a77118790b74e860fe3f939adc00d34a1ab8392f7cc34"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.885608 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2pf5t"
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.890998 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.891054 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.893702 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" event={"ID":"9953c224-6f16-4a0d-ad9a-3ea1e4914499","Type":"ContainerStarted","Data":"667fc0f38f82ce9d26b9543a616ae6a30f6301cae93def296cf0c87ad13f7c84"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.899868 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" event={"ID":"78e98b3d-733f-4b7a-abcf-950d6870c04f","Type":"ContainerStarted","Data":"9179ed7262685d2e4053cf19623860c7f497c9d073c31f73b4bf7fec48e88d99"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.916479 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:40 crc kubenswrapper[4965]: E1125 15:06:40.916720 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.416704419 +0000 UTC m=+146.384298165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.917644 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" event={"ID":"4ee3b773-c658-424e-ab4d-c6ba3e866ce2","Type":"ContainerStarted","Data":"acf25a31c768775bf269763914568d57195153e84c1dce2ed539afe1ae46aca8"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.917681 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" event={"ID":"4ee3b773-c658-424e-ab4d-c6ba3e866ce2","Type":"ContainerStarted","Data":"063b9d47ece073e81e47b0b2a92edfd397072d7f4f0c275804d088324ba11f38"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.937604 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" event={"ID":"77477a76-df54-4755-89b0-9b2ec40e098d","Type":"ContainerStarted","Data":"8b54bc2783c91d73c9fbd0507c65f19d9f00cb999e1d93326ebbd4a7313b6c3b"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.940600 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" event={"ID":"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263","Type":"ContainerStarted","Data":"223a78d9561e5e777888bfbb882e35ea20f5e1c376bb56c060731088d35cdbee"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.950824 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:06:40 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld
Nov 25 15:06:40 crc kubenswrapper[4965]: [+]process-running ok
Nov 25 15:06:40 crc kubenswrapper[4965]: healthz check failed
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.951247 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.955422 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" event={"ID":"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377","Type":"ContainerStarted","Data":"2bf06290c6cf288ac92629db08bf4d51c0b9eab3b3ecf03bc7b13f6c7fe26b49"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.956796 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" event={"ID":"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03","Type":"ContainerStarted","Data":"d4a61a603e5769c14e6231accb5984ae577a8f6215515a92fd1f06e972e087b9"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.958227 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" event={"ID":"317839b7-786d-4e93-8b37-4dd23e4a5032","Type":"ContainerStarted","Data":"1718bc40827fe632945096735d5bebefe6c5480c7c686dcbf4d0e11be45d5003"}
Nov 25 15:06:40 crc kubenswrapper[4965]: I1125 15:06:40.965247 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw"
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.023475 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pzvqw" podStartSLOduration=124.023457463 podStartE2EDuration="2m4.023457463s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:40.983259899 +0000 UTC m=+145.950853645" watchObservedRunningTime="2025-11-25 15:06:41.023457463 +0000 UTC m=+145.991051209"
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.025440 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.026331 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.526316152 +0000 UTC m=+146.493909898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.052823 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.059588 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.061937 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.079580 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gjnb4"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.079625 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.090843 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lcspp"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.094253 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.095857 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t99cv"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.096765 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-82czk" podStartSLOduration=124.096745948 podStartE2EDuration="2m4.096745948s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:41.096329346 +0000 UTC m=+146.063923092" watchObservedRunningTime="2025-11-25 15:06:41.096745948 +0000 UTC m=+146.064339694"
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.126238 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.126361 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.626336661 +0000 UTC m=+146.593930407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.126550 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.127053 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.62704481 +0000 UTC m=+146.594638556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.177787 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.192225 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.196010 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mlsxj"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.197635 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826"]
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.219324 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pdprk" podStartSLOduration=124.219305786 podStartE2EDuration="2m4.219305786s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:41.216426717 +0000 UTC m=+146.184020463" watchObservedRunningTime="2025-11-25 15:06:41.219305786 +0000 UTC m=+146.186899532"
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.227295 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.227462 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.727438309 +0000 UTC m=+146.695032055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.227808 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.228583 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.72854277 +0000 UTC m=+146.696136516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: W1125 15:06:41.253581 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9961f3_d3a0_47aa_a9c4_e311468d76ff.slice/crio-7d367afa3d88e49a8307b5789b0906d83d2db21774f8410614d082ed1eeb0680 WatchSource:0}: Error finding container 7d367afa3d88e49a8307b5789b0906d83d2db21774f8410614d082ed1eeb0680: Status 404 returned error can't find the container with id 7d367afa3d88e49a8307b5789b0906d83d2db21774f8410614d082ed1eeb0680
Nov 25 15:06:41 crc kubenswrapper[4965]: W1125 15:06:41.311839 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod439f7d1e_4721_42db_90a8_d9ce7bfb2302.slice/crio-fe9ff791559844b2ff07745d182b34581a306483780e5949a9fd1ed7a3ae02a8 WatchSource:0}: Error finding container fe9ff791559844b2ff07745d182b34581a306483780e5949a9fd1ed7a3ae02a8: Status 404 returned error can't find the container with id fe9ff791559844b2ff07745d182b34581a306483780e5949a9fd1ed7a3ae02a8
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.329526 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.329874 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.829858425 +0000 UTC m=+146.797452171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: W1125 15:06:41.385395 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f89119_a5f1_41eb_822c_fdd243818a4b.slice/crio-a2aa378ce672251143a710f5f372f5436d8ee1376752bf7f3d0991a9afe2a426 WatchSource:0}: Error finding container a2aa378ce672251143a710f5f372f5436d8ee1376752bf7f3d0991a9afe2a426: Status 404 returned error can't find the container with id a2aa378ce672251143a710f5f372f5436d8ee1376752bf7f3d0991a9afe2a426
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.434767 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.435701 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:41.935686543 +0000 UTC m=+146.903280289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.515184 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2pf5t" podStartSLOduration=124.515168837 podStartE2EDuration="2m4.515168837s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:41.513879921 +0000 UTC m=+146.481473667" watchObservedRunningTime="2025-11-25 15:06:41.515168837 +0000 UTC m=+146.482762583"
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.536139 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.536278 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.036251396 +0000 UTC m=+147.003845142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.536369 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.536733 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.0367238 +0000 UTC m=+147.004317636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.637007 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.637350 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.137329035 +0000 UTC m=+147.104922801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.739181 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m"
Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.739553 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.239537333 +0000 UTC m=+147.207131079 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.840469 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.840850 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.340834157 +0000 UTC m=+147.308427903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.902951 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6blkb"] Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.903931 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.911786 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6blkb"] Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.926549 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.945329 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:41 crc kubenswrapper[4965]: E1125 15:06:41.946132 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.446119911 +0000 UTC m=+147.413713647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.948330 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:41 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:41 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:41 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:41 crc kubenswrapper[4965]: I1125 15:06:41.948578 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.005157 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" event={"ID":"30f89119-a5f1-41eb-822c-fdd243818a4b","Type":"ContainerStarted","Data":"a2aa378ce672251143a710f5f372f5436d8ee1376752bf7f3d0991a9afe2a426"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.025826 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" event={"ID":"d3d12281-f14b-432b-bc6a-7d8d4b9e933e","Type":"ContainerStarted","Data":"cfc8846f09bc9d96e3ce729f3c141cc0a8c2c0b6053cd1ee5fa229402bec6e05"} Nov 25 15:06:42 
crc kubenswrapper[4965]: I1125 15:06:42.046596 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.048796 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.548776491 +0000 UTC m=+147.516370237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.055791 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-utilities\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.055875 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-catalog-content\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " 
pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.055943 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.056007 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/79a7c128-3c65-491c-95b6-52337183df64-kube-api-access-xpjkd\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.056357 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.55634539 +0000 UTC m=+147.523939136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.083772 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" event={"ID":"8944b4a9-8e7d-4a3a-a526-11a31e795453","Type":"ContainerStarted","Data":"65893cceba5bd036b0dc8c739a59be4dcff0d402365a605f2e2988d05ed0fae9"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.101542 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" event={"ID":"d97be63b-6ce3-44ae-b820-25260bf392bf","Type":"ContainerStarted","Data":"8d73ecf3443caa8832dafec6972f8d837b956353403862fc7d8d963da4820aca"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.130219 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9jt9"] Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.132708 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.144617 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" event={"ID":"ed9961f3-d3a0-47aa-a9c4-e311468d76ff","Type":"ContainerStarted","Data":"7d367afa3d88e49a8307b5789b0906d83d2db21774f8410614d082ed1eeb0680"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.150196 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.155707 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9jt9"] Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.168504 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.168670 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-utilities\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.168693 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-catalog-content\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 
15:06:42.168746 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/79a7c128-3c65-491c-95b6-52337183df64-kube-api-access-xpjkd\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.169137 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.669121889 +0000 UTC m=+147.636715635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.169430 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-utilities\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.194588 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-catalog-content\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc 
kubenswrapper[4965]: I1125 15:06:42.206618 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" event={"ID":"6e339e05-6937-4fca-b8f6-060917c05b2c","Type":"ContainerStarted","Data":"0ff314837188365ed5221f854524ff29af688977a85f77b329e4bc53977c1b7b"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.228695 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" event={"ID":"09e6c287-3ef1-4ea0-87c9-54b59acfc772","Type":"ContainerStarted","Data":"f1ab1478642b6d83282b2c4139c42668a5e5410fb1f0dad818fe0c88038b1993"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.246246 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" event={"ID":"78e98b3d-733f-4b7a-abcf-950d6870c04f","Type":"ContainerStarted","Data":"55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.247806 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.247858 4965 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ndw52 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.247882 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" podUID="78e98b3d-733f-4b7a-abcf-950d6870c04f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 
10.217.0.6:8443: connect: connection refused" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.254092 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" event={"ID":"50a60857-8df9-45be-91b8-a41878677884","Type":"ContainerStarted","Data":"ce71bf40d2031bf2035e5208245f6e96cddad4fd8a9123bea2ef8a80f0787871"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.269748 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.269807 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-catalog-content\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.269838 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmjs\" (UniqueName: \"kubernetes.io/projected/1269cb10-777f-46e4-a52f-8088e7af6b2d-kube-api-access-nvmjs\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.269896 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-utilities\") pod \"certified-operators-s9jt9\" (UID: 
\"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.270177 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.770143235 +0000 UTC m=+147.737736981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.285996 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/79a7c128-3c65-491c-95b6-52337183df64-kube-api-access-xpjkd\") pod \"community-operators-6blkb\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.291932 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" event={"ID":"17977cba-cde5-4d42-9c64-1c37568db595","Type":"ContainerStarted","Data":"440928bf925a068c3887197a7f5bdd1b59e45dfd4d1be760272b8d781aba92b7"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.297429 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" 
event={"ID":"439f7d1e-4721-42db-90a8-d9ce7bfb2302","Type":"ContainerStarted","Data":"fe9ff791559844b2ff07745d182b34581a306483780e5949a9fd1ed7a3ae02a8"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.299853 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zt5k2"] Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.301418 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" event={"ID":"107eeea6-61f3-4cc7-b51c-f3100d84f707","Type":"ContainerStarted","Data":"a25efec491fd624c11f77a680bfb4b9c7f3e7b1f011c10e8c9dd38f0bc6c212b"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.301522 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.323725 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" event={"ID":"ac3b054d-ca2f-4ca3-a3fa-6772cad2a377","Type":"ContainerStarted","Data":"77a80338e2ba240a5d6896de0be74f81f82acda2ed5870248c6eac6f06137b98"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.336955 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" event={"ID":"77477a76-df54-4755-89b0-9b2ec40e098d","Type":"ContainerStarted","Data":"7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.337868 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.352016 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zt5k2"] Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.353345 4965 
patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8rgvb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.353387 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" podUID="77477a76-df54-4755-89b0-9b2ec40e098d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.366198 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" event={"ID":"8252e47a-59aa-4a86-8f97-d12cc21fc6d8","Type":"ContainerStarted","Data":"9486ef0aafc7564982c915e51525a828a25cb69f47e60db7bb43d0506203a406"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.373653 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.373864 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-catalog-content\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.373913 4965 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-nvmjs\" (UniqueName: \"kubernetes.io/projected/1269cb10-777f-46e4-a52f-8088e7af6b2d-kube-api-access-nvmjs\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.374053 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-utilities\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.374787 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.874773301 +0000 UTC m=+147.842367047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.377764 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-catalog-content\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.381568 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-utilities\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.416935 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksgsk" podStartSLOduration=125.41691932 podStartE2EDuration="2m5.41691932s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.415516571 +0000 UTC m=+147.383110317" watchObservedRunningTime="2025-11-25 15:06:42.41691932 +0000 UTC m=+147.384513066" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.424798 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmjs\" (UniqueName: 
\"kubernetes.io/projected/1269cb10-777f-46e4-a52f-8088e7af6b2d-kube-api-access-nvmjs\") pod \"certified-operators-s9jt9\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.427433 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" event={"ID":"317839b7-786d-4e93-8b37-4dd23e4a5032","Type":"ContainerStarted","Data":"de6d4402f6ed4f81fba7c3d321f386375810f2105695c3fe70ecf37143c60c1b"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.464479 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.478548 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-utilities\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.478624 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-catalog-content\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.478672 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2g4\" (UniqueName: \"kubernetes.io/projected/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-kube-api-access-qh2g4\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " 
pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.481546 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.491920 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:42.99189794 +0000 UTC m=+147.959491686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.513145 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" podStartSLOduration=125.513109233 podStartE2EDuration="2m5.513109233s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.497392632 +0000 UTC m=+147.464986378" watchObservedRunningTime="2025-11-25 15:06:42.513109233 +0000 UTC m=+147.480702969" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.528232 4965 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fvgv6"] Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.537376 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.541728 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.541744 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvgv6"] Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.578093 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lxkkv" event={"ID":"920d465f-64d8-4954-acfe-4a12f1ea7739","Type":"ContainerStarted","Data":"81547c1f2c249aee02651ec1f1306043e4ad1d517d0c1bff3d19a892c4fa0d89"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.578140 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lxkkv" event={"ID":"920d465f-64d8-4954-acfe-4a12f1ea7739","Type":"ContainerStarted","Data":"39bb834e22971907af7d9f97bbcc6e9e99148a489a678034be512eef8bc7273d"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.592157 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.592415 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.092369411 +0000 UTC m=+148.059963157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.599528 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.599661 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-utilities\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.599696 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-catalog-content\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.599748 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2g4\" (UniqueName: 
\"kubernetes.io/projected/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-kube-api-access-qh2g4\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.602172 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.10213045 +0000 UTC m=+148.069724196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.602313 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-utilities\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.603226 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6g7bs" podStartSLOduration=125.603204569 podStartE2EDuration="2m5.603204569s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.602777667 +0000 UTC m=+147.570371413" watchObservedRunningTime="2025-11-25 15:06:42.603204569 +0000 
UTC m=+147.570798315" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.603314 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-catalog-content\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.645401 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" event={"ID":"91f26391-3d69-4625-a15b-286b29e14161","Type":"ContainerStarted","Data":"7630ece29cac1b02f4390b8e584932fe630e78dfc50f56550d231816329c51db"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.645455 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" event={"ID":"91f26391-3d69-4625-a15b-286b29e14161","Type":"ContainerStarted","Data":"2daddb53cca3dbbe76ee04f7c1dcfffc20bed79b6e305b104e65fdca1e47a819"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.663527 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2g4\" (UniqueName: \"kubernetes.io/projected/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-kube-api-access-qh2g4\") pod \"community-operators-zt5k2\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.693693 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" event={"ID":"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1","Type":"ContainerStarted","Data":"a9480a59fae99887ce186b7a3beca72fb8cfc2292c464ff86c3c9e85447cac18"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.695039 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.705907 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" event={"ID":"423ab6bf-9cad-43cd-af44-e0cee05b262b","Type":"ContainerStarted","Data":"c2c86024b75564f92dcc1eebbfb7e6467c60d1a142f1ab15c596c619e03699d9"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.706234 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.706521 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.706787 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.206638351 +0000 UTC m=+148.174232097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.706933 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnvk6\" (UniqueName: \"kubernetes.io/projected/d950b336-b79c-4b02-a695-66f4757027ca-kube-api-access-bnvk6\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.707005 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-utilities\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.707171 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.707284 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-catalog-content\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:42 crc kubenswrapper[4965]: E1125 15:06:42.707598 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.207590818 +0000 UTC m=+148.175184564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.719929 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jv944" podStartSLOduration=125.719900926 podStartE2EDuration="2m5.719900926s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.718337173 +0000 UTC m=+147.685930919" watchObservedRunningTime="2025-11-25 15:06:42.719900926 +0000 UTC m=+147.687494672" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.721465 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rm7qh" podStartSLOduration=125.72145888 podStartE2EDuration="2m5.72145888s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.665983115 +0000 UTC m=+147.633576881" watchObservedRunningTime="2025-11-25 15:06:42.72145888 +0000 UTC m=+147.689052626" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.731086 4965 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nld56 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.731141 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.732218 4965 patch_prober.go:28] interesting pod/console-operator-58897d9998-5mpvj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.732261 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" podUID="423ab6bf-9cad-43cd-af44-e0cee05b262b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.734421 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" 
event={"ID":"30b30333-fbb1-476f-8bf2-146f2ee696a7","Type":"ContainerStarted","Data":"da9e1acd92edc1d2f57d7ec751e1658637f37c0f1a77c123b67c22374e995ab9"} Nov 25 15:06:42 crc kubenswrapper[4965]: I1125 15:06:42.735278 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.755536 4965 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-268bv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.755600 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" podUID="30b30333-fbb1-476f-8bf2-146f2ee696a7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.755783 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gjnb4" event={"ID":"74a6fca6-52bf-4568-baa8-5bbcd0904722","Type":"ContainerStarted","Data":"ca702e69d1207a377ce7f4e12c94973c4c5d0badd23d19e172946a9a38a83e26"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.807973 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.808204 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-catalog-content\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.808230 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnvk6\" (UniqueName: \"kubernetes.io/projected/d950b336-b79c-4b02-a695-66f4757027ca-kube-api-access-bnvk6\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.808267 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-utilities\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.808683 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-utilities\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:42.808743 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.308731048 +0000 UTC m=+148.276324794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.810417 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-catalog-content\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.821111 4965 generic.go:334] "Generic (PLEG): container finished" podID="d55c18b9-bb80-428c-95ea-f21c6b0694e4" containerID="e227a41ca53bf6bb46bd5d9b4c44e59101a84fa77f249d02c37b8f152aa04561" exitCode=0 Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.824774 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" podStartSLOduration=125.824759208 podStartE2EDuration="2m5.824759208s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.821513989 +0000 UTC m=+147.789107735" watchObservedRunningTime="2025-11-25 15:06:42.824759208 +0000 UTC m=+147.792352954" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.825400 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" podStartSLOduration=125.825392726 
podStartE2EDuration="2m5.825392726s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.78773391 +0000 UTC m=+147.755327656" watchObservedRunningTime="2025-11-25 15:06:42.825392726 +0000 UTC m=+147.792986472" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.848097 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" podStartSLOduration=125.848080249 podStartE2EDuration="2m5.848080249s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.846185037 +0000 UTC m=+147.813778783" watchObservedRunningTime="2025-11-25 15:06:42.848080249 +0000 UTC m=+147.815674015" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.864929 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnvk6\" (UniqueName: \"kubernetes.io/projected/d950b336-b79c-4b02-a695-66f4757027ca-kube-api-access-bnvk6\") pod \"certified-operators-fvgv6\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.889322 4965 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fjgbs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.889369 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerName="oauth-openshift" probeResult="failure" 
output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.909681 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:42.911698 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.411681816 +0000 UTC m=+148.379275562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.949151 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:43 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:43 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:43 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.949194 4965 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.970648 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tbt5x" podStartSLOduration=125.970631666 podStartE2EDuration="2m5.970631666s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.933549398 +0000 UTC m=+147.901143144" watchObservedRunningTime="2025-11-25 15:06:42.970631666 +0000 UTC m=+147.938225412" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.988674 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.988716 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.996784 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" podStartSLOduration=125.996768555 podStartE2EDuration="2m5.996768555s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 15:06:42.971498391 +0000 UTC m=+147.939092127" watchObservedRunningTime="2025-11-25 15:06:42.996768555 +0000 UTC m=+147.964362311" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:42.997725 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lxkkv" podStartSLOduration=6.997717771 podStartE2EDuration="6.997717771s" podCreationTimestamp="2025-11-25 15:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:42.996437226 +0000 UTC m=+147.964030972" watchObservedRunningTime="2025-11-25 15:06:42.997717771 +0000 UTC m=+147.965311517" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.017735 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.020501 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.520475136 +0000 UTC m=+148.488068882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039717 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039752 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" event={"ID":"319bbac1-c3c2-46c2-b0c2-efa7a9cc6c03","Type":"ContainerStarted","Data":"bd7f4f0e2b1d384f23fc5b3974e501b0c1dbed6f2fae74368dcbaac9f419ac5f"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039769 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" event={"ID":"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a","Type":"ContainerStarted","Data":"e20beb0854d305b63b3a390e6186a97cb6bcf36b71530a02d770432cf0f26b8c"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039781 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" event={"ID":"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef","Type":"ContainerStarted","Data":"80be2dbe777969d116ba1e234d8669d275314ce29e082cd603df6ad647c85e81"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039792 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" event={"ID":"d55c18b9-bb80-428c-95ea-f21c6b0694e4","Type":"ContainerDied","Data":"e227a41ca53bf6bb46bd5d9b4c44e59101a84fa77f249d02c37b8f152aa04561"} Nov 25 
15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039803 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" event={"ID":"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390","Type":"ContainerStarted","Data":"98d61173b0e4b6e551d808a6da5ab58ae1b77ed2164fd9cd89e2040e5a0389b3"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039812 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" event={"ID":"4ee3b773-c658-424e-ab4d-c6ba3e866ce2","Type":"ContainerStarted","Data":"45451ba98f2ff30858301ceeccf68eb3f445d1c96c57a595f8aa00170f6130b6"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039827 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" event={"ID":"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263","Type":"ContainerStarted","Data":"c1091e338232029666339e7ae34a65111b13c02b1989d43019a319349ea12272"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039854 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" event={"ID":"ded91b12-10e5-4c46-aaa7-e54c34072789","Type":"ContainerStarted","Data":"f91f27fba0e492aecba938ba2d76b1a7f46c360ab1e45fd201bef6edcde2d58c"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039864 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" event={"ID":"8a01a344-a2a2-4d3c-9bc3-5e911936606c","Type":"ContainerStarted","Data":"63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039872 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" 
event={"ID":"9953c224-6f16-4a0d-ad9a-3ea1e4914499","Type":"ContainerStarted","Data":"a42db7853acecf4b75c646be709603a1eff9948fb1719da03c21801c5e4651c0"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039882 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" event={"ID":"8debd4d2-5319-4aa1-bd2d-aee777eba0ba","Type":"ContainerStarted","Data":"de966e79c9746ad6a3528d7da8aa21f7fff2498e43fd3f3d311161e5077ae3dd"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039890 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" event={"ID":"8debd4d2-5319-4aa1-bd2d-aee777eba0ba","Type":"ContainerStarted","Data":"b135d33015ed9aecaf099550d71c534af4082d2ab02d3948e0f6875ba8469be4"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039900 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mlsxj" event={"ID":"59cdf509-5573-4cea-acca-7a5830b58bcf","Type":"ContainerStarted","Data":"441c22f682a5b528acb7a73ff7b7e4b0f967ec0a904d910582bc30c03f70f5d5"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039909 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" event={"ID":"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb","Type":"ContainerStarted","Data":"707776be6bc122b83a9e97e1bb2d9d83899fc7a9e10e225ec2eca34e1d92b3c0"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.039918 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" event={"ID":"a9f89ef7-da88-4a5b-a0ca-5474493bf82f","Type":"ContainerStarted","Data":"bb8583614a1fe63e87f74aee14921174d7aa9e077745cd34e48651ee16ff9306"} Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.040754 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.043878 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.54386252 +0000 UTC m=+148.511456256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.063814 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.116834 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" podStartSLOduration=126.116815835 podStartE2EDuration="2m6.116815835s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.063728116 +0000 UTC m=+148.031321862" watchObservedRunningTime="2025-11-25 15:06:43.116815835 +0000 UTC m=+148.084409581" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.143777 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.144541 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.644506066 +0000 UTC m=+148.612099822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.153730 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.154271 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.154916 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.654899531 +0000 UTC m=+148.622493277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.198076 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" podStartSLOduration=126.198056687 podStartE2EDuration="2m6.198056687s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.154739396 +0000 UTC m=+148.122333142" watchObservedRunningTime="2025-11-25 15:06:43.198056687 +0000 UTC m=+148.165650433" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.218318 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" podStartSLOduration=126.218302674 podStartE2EDuration="2m6.218302674s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.200059042 +0000 UTC m=+148.167652788" watchObservedRunningTime="2025-11-25 15:06:43.218302674 +0000 UTC m=+148.185896410" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.255090 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.256163 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.756141583 +0000 UTC m=+148.723735339 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.274357 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" podStartSLOduration=126.274334123 podStartE2EDuration="2m6.274334123s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.253443649 +0000 UTC m=+148.221037395" watchObservedRunningTime="2025-11-25 15:06:43.274334123 +0000 UTC m=+148.241927869" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.292408 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-67kvn" podStartSLOduration=126.292394359 podStartE2EDuration="2m6.292394359s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.290551809 +0000 UTC m=+148.258145555" 
watchObservedRunningTime="2025-11-25 15:06:43.292394359 +0000 UTC m=+148.259988105" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.332369 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" podStartSLOduration=126.332350138 podStartE2EDuration="2m6.332350138s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.330949969 +0000 UTC m=+148.298543715" watchObservedRunningTime="2025-11-25 15:06:43.332350138 +0000 UTC m=+148.299943884" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.357929 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.358266 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.85825517 +0000 UTC m=+148.825848916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.406848 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g9mfz" podStartSLOduration=126.406830864 podStartE2EDuration="2m6.406830864s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.40521801 +0000 UTC m=+148.372811756" watchObservedRunningTime="2025-11-25 15:06:43.406830864 +0000 UTC m=+148.374424610" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.447209 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gb2gn" podStartSLOduration=126.447180734 podStartE2EDuration="2m6.447180734s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.435129152 +0000 UTC m=+148.402722898" watchObservedRunningTime="2025-11-25 15:06:43.447180734 +0000 UTC m=+148.414774480" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.464357 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.464730 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:43.964715305 +0000 UTC m=+148.932309051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.566276 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lh8z6" podStartSLOduration=126.566256636 podStartE2EDuration="2m6.566256636s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.565828414 +0000 UTC m=+148.533422160" watchObservedRunningTime="2025-11-25 15:06:43.566256636 +0000 UTC m=+148.533850372" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.566764 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 
25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.567126 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.067115759 +0000 UTC m=+149.034709505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.568195 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" podStartSLOduration=126.568187279 podStartE2EDuration="2m6.568187279s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:43.534703449 +0000 UTC m=+148.502297195" watchObservedRunningTime="2025-11-25 15:06:43.568187279 +0000 UTC m=+148.535781015" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.669474 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.669799 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.169783931 +0000 UTC m=+149.137377677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.772731 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.773258 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.273246695 +0000 UTC m=+149.240840441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.887571 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.887796 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.387780462 +0000 UTC m=+149.355374208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.887857 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.888164 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.388156243 +0000 UTC m=+149.355749989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.956035 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:43 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:43 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:43 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.956412 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.995918 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.996248 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:06:44.496194622 +0000 UTC m=+149.463788378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:43 crc kubenswrapper[4965]: I1125 15:06:43.996398 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:43 crc kubenswrapper[4965]: E1125 15:06:43.996762 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.496749667 +0000 UTC m=+149.464343413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.014596 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ndvs4"] Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.015842 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.043921 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.050783 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndvs4"] Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.062371 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" event={"ID":"439f7d1e-4721-42db-90a8-d9ce7bfb2302","Type":"ContainerStarted","Data":"d7aa1ea0ab71271d7be76157c43a682bb6b0556ac333068c72fd23f7d725b234"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.098323 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.098845 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-catalog-content\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.098876 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-utilities\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.098898 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9tv\" (UniqueName: \"kubernetes.io/projected/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-kube-api-access-xs9tv\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.099035 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.599020747 +0000 UTC m=+149.566614493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.108274 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" event={"ID":"423ab6bf-9cad-43cd-af44-e0cee05b262b","Type":"ContainerStarted","Data":"65065b37dd9f113fa5dbdb2dd5d1b5e342ea2839c0e1096ff7338f5264b6cb98"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.109275 4965 patch_prober.go:28] interesting pod/console-operator-58897d9998-5mpvj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.109340 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" podUID="423ab6bf-9cad-43cd-af44-e0cee05b262b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.124259 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmhs2" event={"ID":"8252e47a-59aa-4a86-8f97-d12cc21fc6d8","Type":"ContainerStarted","Data":"8492da87092103796b8337a3590442031d590bd367872f6374bf65bdc95f271a"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.124390 4965 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.165457 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" event={"ID":"8944b4a9-8e7d-4a3a-a526-11a31e795453","Type":"ContainerStarted","Data":"79ec38b6d286fa6647df0212d451a300b9f4063f329bd665b54541d888304f88"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.167193 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.176068 4965 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-frm99 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.176132 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" podUID="8944b4a9-8e7d-4a3a-a526-11a31e795453" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.177172 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gjnb4" event={"ID":"74a6fca6-52bf-4568-baa8-5bbcd0904722","Type":"ContainerStarted","Data":"f7b5b36b99ebae07ebe6940201d690c75141cfe52dd3803d57254566274afd0e"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.180083 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7hnpl" 
event={"ID":"573dcfaf-c9ea-4c83-b2b1-8c54c2f7d390","Type":"ContainerStarted","Data":"00d37ce3d07a2902872da68a7b2b8b3fd5eb14c7014f926db03ce5bbb945dd0d"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.199486 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.199568 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-catalog-content\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.199605 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-utilities\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.199628 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9tv\" (UniqueName: \"kubernetes.io/projected/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-kube-api-access-xs9tv\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.207456 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-catalog-content\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.208280 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.70826491 +0000 UTC m=+149.675858656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.208384 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-utilities\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.208740 4965 generic.go:334] "Generic (PLEG): container finished" podID="d3d12281-f14b-432b-bc6a-7d8d4b9e933e" containerID="afcd09e531da8986c60003695163c045a93329615d05f56eb840932790e83976" exitCode=0 Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.208958 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" 
event={"ID":"d3d12281-f14b-432b-bc6a-7d8d4b9e933e","Type":"ContainerDied","Data":"afcd09e531da8986c60003695163c045a93329615d05f56eb840932790e83976"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.239950 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9tv\" (UniqueName: \"kubernetes.io/projected/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-kube-api-access-xs9tv\") pod \"redhat-marketplace-ndvs4\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.267140 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" event={"ID":"d97be63b-6ce3-44ae-b820-25260bf392bf","Type":"ContainerStarted","Data":"c554adfd20abb52536f40ef33a32af407af0df3a25b206075ebe43d4bcf4a4e9"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.269287 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kp5tp" podStartSLOduration=127.269267166 podStartE2EDuration="2m7.269267166s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.159435308 +0000 UTC m=+149.127029054" watchObservedRunningTime="2025-11-25 15:06:44.269267166 +0000 UTC m=+149.236860912" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.269539 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.301623 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.302062 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.802043268 +0000 UTC m=+149.769637014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.302232 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.305476 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.805464231 +0000 UTC m=+149.773057977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.311412 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" event={"ID":"ded91b12-10e5-4c46-aaa7-e54c34072789","Type":"ContainerStarted","Data":"694d67767de9dc53d02546aaa675aa56d44b6c88a400519ee8f06ce4defebe68"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.324418 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9jt9"] Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.335306 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" event={"ID":"d55c18b9-bb80-428c-95ea-f21c6b0694e4","Type":"ContainerStarted","Data":"3a9c3344c73d2d449f0f5f49ba6f4e6f04f54d230de60892baffa420e063ec62"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.337237 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" event={"ID":"6e339e05-6937-4fca-b8f6-060917c05b2c","Type":"ContainerStarted","Data":"1beba40c78c4a385fe248c37b8bbbd96071b831fb6112b0046bd3a196c453eef"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.337266 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" event={"ID":"6e339e05-6937-4fca-b8f6-060917c05b2c","Type":"ContainerStarted","Data":"f26a332ba940c55b3c41521b2b722edc15705f3f0f715b649fc7aac28fb236c8"} Nov 25 15:06:44 crc 
kubenswrapper[4965]: I1125 15:06:44.355941 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g9kt4"] Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.367046 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.392398 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9kt4"] Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.396690 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gjnb4" podStartSLOduration=8.396661738 podStartE2EDuration="8.396661738s" podCreationTimestamp="2025-11-25 15:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.362881699 +0000 UTC m=+149.330475445" watchObservedRunningTime="2025-11-25 15:06:44.396661738 +0000 UTC m=+149.364255484" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.397129 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.403277 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.403521 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvtz\" (UniqueName: \"kubernetes.io/projected/be82ef6f-bd03-4c9f-a760-d836fccf52a7-kube-api-access-zhvtz\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.403546 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-catalog-content\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.403646 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:44.903616719 +0000 UTC m=+149.871210465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.403748 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-utilities\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.438241 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b6gl2" event={"ID":"317839b7-786d-4e93-8b37-4dd23e4a5032","Type":"ContainerStarted","Data":"029c60ca038ffd7c860f54167b233187b17eb6e452869add4951b95e39af1ae7"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.457292 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" event={"ID":"a9f89ef7-da88-4a5b-a0ca-5474493bf82f","Type":"ContainerStarted","Data":"87f03ba8cb74740cb603695ac5f337c9e78320b305697e1688b8e9699db45e66"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.472754 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wb26t" podStartSLOduration=127.472737559 podStartE2EDuration="2m7.472737559s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 15:06:44.472329348 +0000 UTC m=+149.439923084" watchObservedRunningTime="2025-11-25 15:06:44.472737559 +0000 UTC m=+149.440331295" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.473124 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" podStartSLOduration=127.473119309 podStartE2EDuration="2m7.473119309s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.442329413 +0000 UTC m=+149.409923159" watchObservedRunningTime="2025-11-25 15:06:44.473119309 +0000 UTC m=+149.440713055" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.515502 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvtz\" (UniqueName: \"kubernetes.io/projected/be82ef6f-bd03-4c9f-a760-d836fccf52a7-kube-api-access-zhvtz\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.516218 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-catalog-content\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.516387 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.516488 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-utilities\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.516871 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-catalog-content\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.517344 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.017333324 +0000 UTC m=+149.984927070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.520706 4965 generic.go:334] "Generic (PLEG): container finished" podID="ed9961f3-d3a0-47aa-a9c4-e311468d76ff" containerID="fad0d99dbfb26ade2ed8dfd131716bb5f5008e6943b85aca209c3dfcc178aed1" exitCode=0 Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.520777 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" event={"ID":"ed9961f3-d3a0-47aa-a9c4-e311468d76ff","Type":"ContainerDied","Data":"fad0d99dbfb26ade2ed8dfd131716bb5f5008e6943b85aca209c3dfcc178aed1"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.521605 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-utilities\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.547738 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" podStartSLOduration=127.547725229 podStartE2EDuration="2m7.547725229s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.534016853 +0000 UTC m=+149.501610599" watchObservedRunningTime="2025-11-25 15:06:44.547725229 
+0000 UTC m=+149.515318975" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.555326 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" event={"ID":"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef","Type":"ContainerStarted","Data":"97d3a6ce2bf41437b665046b41b7a15d97add8c50a15cf56bf7b5c2962fdf89d"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.583987 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvtz\" (UniqueName: \"kubernetes.io/projected/be82ef6f-bd03-4c9f-a760-d836fccf52a7-kube-api-access-zhvtz\") pod \"redhat-marketplace-g9kt4\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.602911 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xtpw9" podStartSLOduration=127.602894615 podStartE2EDuration="2m7.602894615s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.601565199 +0000 UTC m=+149.569158935" watchObservedRunningTime="2025-11-25 15:06:44.602894615 +0000 UTC m=+149.570488361" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.603690 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" event={"ID":"7aec0aba-2d0b-49fb-a6ac-8f5ed7bf9263","Type":"ContainerStarted","Data":"3df700bf70a981cc5feeaae4bc60a9e51516026d4cd611f65899b1f0a4b395b7"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.619493 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" 
event={"ID":"30f89119-a5f1-41eb-822c-fdd243818a4b","Type":"ContainerStarted","Data":"5f39e1a23801529033aacc1639fd12658bb0247a425389b11eed7bd4c5f18d5b"} Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.621131 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.121102466 +0000 UTC m=+150.088696212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.631096 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.631536 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.633383 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.133366223 +0000 UTC m=+150.100959969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.646120 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" podStartSLOduration=127.646102884 podStartE2EDuration="2m7.646102884s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.643583784 +0000 UTC m=+149.611177530" watchObservedRunningTime="2025-11-25 15:06:44.646102884 +0000 UTC m=+149.613696630" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.647015 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxkng" event={"ID":"107eeea6-61f3-4cc7-b51c-f3100d84f707","Type":"ContainerStarted","Data":"656868e969a29399e6cb393bd0e2af930a761324c25e3e595580a599099893d8"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.649757 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" 
event={"ID":"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb","Type":"ContainerStarted","Data":"c9fff3697f84e5b7010df627a2098cb447a53ba2ccf989e72f258a70bbf5b681"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.649800 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" event={"ID":"ef9cb97a-abf5-477c-bcde-ed43ba1a80fb","Type":"ContainerStarted","Data":"c4ad55cb9ca2eb18f137275b2cc2d8205a8867629a283372d20b2438da756673"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.695022 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mlsxj" event={"ID":"59cdf509-5573-4cea-acca-7a5830b58bcf","Type":"ContainerStarted","Data":"5d1493c381afa5190b1330ed3289ef15bc8f5aca22f03d0baa62e9c412eb6d8f"} Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.695274 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.703185 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.703248 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.704427 4965 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nld56 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection 
refused" start-of-body= Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.704451 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.733168 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.734723 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.234709808 +0000 UTC m=+150.202303554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.734783 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.734944 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.740981 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-l75ns" podStartSLOduration=127.740921368 podStartE2EDuration="2m7.740921368s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.734137832 +0000 UTC m=+149.701731588" watchObservedRunningTime="2025-11-25 15:06:44.740921368 +0000 UTC m=+149.708515114" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.745317 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.788843 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lcr9f" podStartSLOduration=127.788820736 podStartE2EDuration="2m7.788820736s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.788350472 +0000 UTC m=+149.755944228" watchObservedRunningTime="2025-11-25 15:06:44.788820736 +0000 UTC m=+149.756414482" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.838840 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.840337 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.34029545 +0000 UTC m=+150.307889196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.869731 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mlsxj" podStartSLOduration=8.869708248 podStartE2EDuration="8.869708248s" podCreationTimestamp="2025-11-25 15:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.85338164 +0000 UTC m=+149.820975386" watchObservedRunningTime="2025-11-25 15:06:44.869708248 +0000 UTC m=+149.837301994" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.935677 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wkf6l" podStartSLOduration=127.93564515 podStartE2EDuration="2m7.93564515s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:44.934582122 +0000 UTC m=+149.902175868" watchObservedRunningTime="2025-11-25 15:06:44.93564515 +0000 UTC m=+149.903238896" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.948339 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:44 crc kubenswrapper[4965]: [-]has-synced failed: 
reason withheld Nov 25 15:06:44 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:44 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.948674 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.948567 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.948631 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.448615817 +0000 UTC m=+150.416209563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.949284 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.949424 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvgv6"] Nov 25 15:06:44 crc kubenswrapper[4965]: E1125 15:06:44.949519 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.449511462 +0000 UTC m=+150.417105208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.952571 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zt5k2"] Nov 25 15:06:44 crc kubenswrapper[4965]: I1125 15:06:44.961413 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6blkb"] Nov 25 15:06:45 crc kubenswrapper[4965]: W1125 15:06:45.003131 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd950b336_b79c_4b02_a695_66f4757027ca.slice/crio-7400c79cc0b4908206074b8336dd7ff9980fc5614b36e0ac13a3c57f26c78abd WatchSource:0}: Error finding container 7400c79cc0b4908206074b8336dd7ff9980fc5614b36e0ac13a3c57f26c78abd: Status 404 returned error can't find the container with id 7400c79cc0b4908206074b8336dd7ff9980fc5614b36e0ac13a3c57f26c78abd Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.049807 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.050238 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.550219859 +0000 UTC m=+150.517813605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: W1125 15:06:45.078947 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0316a5a7_25ae_44be_ab7e_f3499e04aa8e.slice/crio-b7da68ca6d241602861c4f0d6370bfb7cb940970a9c96f5542fc0d733f9bdc01 WatchSource:0}: Error finding container b7da68ca6d241602861c4f0d6370bfb7cb940970a9c96f5542fc0d733f9bdc01: Status 404 returned error can't find the container with id b7da68ca6d241602861c4f0d6370bfb7cb940970a9c96f5542fc0d733f9bdc01 Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.113667 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j8f94"] Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.114584 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.121348 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.150739 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j8f94"] Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.151871 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.152185 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.652174621 +0000 UTC m=+150.619768367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.252545 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.252993 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.752955031 +0000 UTC m=+150.720548777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.253218 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-catalog-content\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.253267 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.253284 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwgp\" (UniqueName: \"kubernetes.io/projected/2fccc0df-85ec-4aeb-9217-00c37ea16e67-kube-api-access-2wwgp\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.253311 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-utilities\") 
pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.253546 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.753539126 +0000 UTC m=+150.721132872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.354489 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.354735 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-catalog-content\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.354792 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwgp\" (UniqueName: 
\"kubernetes.io/projected/2fccc0df-85ec-4aeb-9217-00c37ea16e67-kube-api-access-2wwgp\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.354815 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-utilities\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.355290 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-utilities\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.355507 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-catalog-content\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.355568 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.855554361 +0000 UTC m=+150.823148097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.411725 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwgp\" (UniqueName: \"kubernetes.io/projected/2fccc0df-85ec-4aeb-9217-00c37ea16e67-kube-api-access-2wwgp\") pod \"redhat-operators-j8f94\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.452118 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.456534 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.456869 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:45.956856824 +0000 UTC m=+150.924450570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.523893 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpgvl"] Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.525192 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.558086 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.558595 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.0585776 +0000 UTC m=+151.026171346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.627132 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpgvl"] Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.659905 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-utilities\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.659991 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.660013 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-catalog-content\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.660047 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhdz\" (UniqueName: \"kubernetes.io/projected/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-kube-api-access-cxhdz\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.660370 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.160359678 +0000 UTC m=+151.127953424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.697534 4965 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-268bv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.697588 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" podUID="30b30333-fbb1-476f-8bf2-146f2ee696a7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.705020 4965 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fjgbs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.705087 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.718068 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerStarted","Data":"7400c79cc0b4908206074b8336dd7ff9980fc5614b36e0ac13a3c57f26c78abd"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.732155 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6blkb" event={"ID":"79a7c128-3c65-491c-95b6-52337183df64","Type":"ContainerStarted","Data":"4f68fbdac800b963148815bbba9df5458f6c9ff77a7c027793e18d520d0c86bb"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.734262 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" event={"ID":"d3d12281-f14b-432b-bc6a-7d8d4b9e933e","Type":"ContainerStarted","Data":"177416d7b9d7646e41b272bbde3613d27e65892198aee3b2a695d3e7327f0949"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.734983 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.736255 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" event={"ID":"50a60857-8df9-45be-91b8-a41878677884","Type":"ContainerStarted","Data":"0513892962eaf267119210a8cd273adeff5b92d66ba5959542c99cd33af49b7f"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.737949 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" event={"ID":"ed9961f3-d3a0-47aa-a9c4-e311468d76ff","Type":"ContainerStarted","Data":"88a8007d6f9c32b2c9bab35887e5c161a0e5af2af3e829285786fa7c720bac19"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.753573 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mlsxj" event={"ID":"59cdf509-5573-4cea-acca-7a5830b58bcf","Type":"ContainerStarted","Data":"257d6a26503a3c2719cada7473fd205a2407abe9dfdaf09b042956b4ab0d92eb"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.760933 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.761356 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.761427 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-catalog-content\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.761481 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.761513 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhdz\" (UniqueName: \"kubernetes.io/projected/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-kube-api-access-cxhdz\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.761564 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.761653 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:45 crc 
kubenswrapper[4965]: I1125 15:06:45.761724 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-utilities\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.762179 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.262154395 +0000 UTC m=+151.229748141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.762772 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-utilities\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.763182 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-catalog-content\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: 
I1125 15:06:45.763950 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.776788 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kd826" event={"ID":"547f6e0d-0e57-4ebd-94bb-1cff5d6fedef","Type":"ContainerStarted","Data":"362453a93216344c97866b646dac4c7cb2ed580cdac904a7320da34892550a40"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.778526 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.778575 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.781416 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.788728 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.795289 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.803195 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt5k2" event={"ID":"0316a5a7-25ae-44be-ab7e-f3499e04aa8e","Type":"ContainerStarted","Data":"b7da68ca6d241602861c4f0d6370bfb7cb940970a9c96f5542fc0d733f9bdc01"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.805800 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" podStartSLOduration=128.805787024 podStartE2EDuration="2m8.805787024s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:45.785539707 +0000 UTC m=+150.753133453" watchObservedRunningTime="2025-11-25 15:06:45.805787024 +0000 UTC m=+150.773380760" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.810644 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhdz\" (UniqueName: \"kubernetes.io/projected/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-kube-api-access-cxhdz\") pod \"redhat-operators-kpgvl\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.818449 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" 
event={"ID":"d55c18b9-bb80-428c-95ea-f21c6b0694e4","Type":"ContainerStarted","Data":"9defe1bd9066b13ae595add1abb2d80d096084109271422071acbdd13ec86810"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.824192 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" event={"ID":"ded91b12-10e5-4c46-aaa7-e54c34072789","Type":"ContainerStarted","Data":"c42d6f93cf7bd1a6ea66252b6ac6bc5674a24d614638421a85e73683cfcb41f6"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.850697 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9jt9" event={"ID":"1269cb10-777f-46e4-a52f-8088e7af6b2d","Type":"ContainerStarted","Data":"2402bfe97d6421b40be7d1bdf96b6779d2cb8ddeba0f9968a153576105522ddc"} Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.851815 4965 patch_prober.go:28] interesting pod/console-operator-58897d9998-5mpvj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.851861 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" podUID="423ab6bf-9cad-43cd-af44-e0cee05b262b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.865038 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.873105 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.875444 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.375426427 +0000 UTC m=+151.343020183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.894736 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" podStartSLOduration=128.894712828 podStartE2EDuration="2m8.894712828s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:45.887781858 +0000 UTC m=+150.855375614" watchObservedRunningTime="2025-11-25 15:06:45.894712828 +0000 UTC m=+150.862306564" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.961121 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frm99" Nov 25 15:06:45 crc kubenswrapper[4965]: I1125 15:06:45.986192 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:45 crc kubenswrapper[4965]: E1125 15:06:45.988444 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.488420263 +0000 UTC m=+151.456014009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:45.996234 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.002337 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:46 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:46 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:46 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.002387 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.002988 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" podStartSLOduration=129.002960503 podStartE2EDuration="2m9.002960503s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:46.002004887 +0000 UTC m=+150.969598623" watchObservedRunningTime="2025-11-25 15:06:46.002960503 +0000 UTC m=+150.970554249" Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.089780 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc 
kubenswrapper[4965]: E1125 15:06:46.090222 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.59020499 +0000 UTC m=+151.557798736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.153767 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lcspp" podStartSLOduration=129.153745147 podStartE2EDuration="2m9.153745147s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:46.060292898 +0000 UTC m=+151.027886644" watchObservedRunningTime="2025-11-25 15:06:46.153745147 +0000 UTC m=+151.121338893" Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.192038 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.192300 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.692260026 +0000 UTC m=+151.659853772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.192508 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.192957 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.692944574 +0000 UTC m=+151.660538320 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.294247 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.294422 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.794396162 +0000 UTC m=+151.761989908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.294678 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.295023 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.795012559 +0000 UTC m=+151.762606305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.323700 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndvs4"] Nov 25 15:06:46 crc kubenswrapper[4965]: W1125 15:06:46.344031 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd34d943_ba97_48d9_bb26_cfa6d0ec549b.slice/crio-901f47277456c092ddd972267f7d23450433ac248fa4af25079317ee048ae0c8 WatchSource:0}: Error finding container 901f47277456c092ddd972267f7d23450433ac248fa4af25079317ee048ae0c8: Status 404 returned error can't find the container with id 901f47277456c092ddd972267f7d23450433ac248fa4af25079317ee048ae0c8 Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.352881 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9kt4"] Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.395891 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.396191 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.896156738 +0000 UTC m=+151.863750494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.396476 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.397028 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.897015343 +0000 UTC m=+151.864609089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: W1125 15:06:46.396482 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe82ef6f_bd03_4c9f_a760_d836fccf52a7.slice/crio-aee938dc4ed33beb99edd36f3fac89153eaa8eec16d34ab027c7251ddfae1e30 WatchSource:0}: Error finding container aee938dc4ed33beb99edd36f3fac89153eaa8eec16d34ab027c7251ddfae1e30: Status 404 returned error can't find the container with id aee938dc4ed33beb99edd36f3fac89153eaa8eec16d34ab027c7251ddfae1e30 Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.498014 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.498338 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:46.998323707 +0000 UTC m=+151.965917453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.599414 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.599929 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.099871067 +0000 UTC m=+152.067464813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.700863 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.701085 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.201056068 +0000 UTC m=+152.168649814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.701192 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.701452 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.201440768 +0000 UTC m=+152.169034514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.803416 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.803594 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.303560295 +0000 UTC m=+152.271154041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.803667 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.803946 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.303935425 +0000 UTC m=+152.271529171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.867302 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" event={"ID":"50a60857-8df9-45be-91b8-a41878677884","Type":"ContainerStarted","Data":"bb2d79780d6263c3018e021beb40816564c4d9bf8e28fc38991d55c5bea0dd84"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.871240 4965 generic.go:334] "Generic (PLEG): container finished" podID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerID="c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d" exitCode=0 Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.871302 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9jt9" event={"ID":"1269cb10-777f-46e4-a52f-8088e7af6b2d","Type":"ContainerDied","Data":"c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.873389 4965 generic.go:334] "Generic (PLEG): container finished" podID="d950b336-b79c-4b02-a695-66f4757027ca" containerID="9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1" exitCode=0 Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.873805 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerDied","Data":"9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.882601 4965 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.883120 4965 generic.go:334] "Generic (PLEG): container finished" podID="79a7c128-3c65-491c-95b6-52337183df64" containerID="59de2e8747e9de7029ff7b949030bb9ce7deeeb86e69e48dad55ca4b62e68484" exitCode=0 Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.883192 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6blkb" event={"ID":"79a7c128-3c65-491c-95b6-52337183df64","Type":"ContainerDied","Data":"59de2e8747e9de7029ff7b949030bb9ce7deeeb86e69e48dad55ca4b62e68484"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.885864 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerStarted","Data":"6f4ae749b08f1505ccb07a1541823399da8b6b6309573d7f4d3c0e562108c09e"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.885887 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerStarted","Data":"901f47277456c092ddd972267f7d23450433ac248fa4af25079317ee048ae0c8"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.901342 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerStarted","Data":"b051a7237f72dc7af7c4e4be4425ef060da5785a432f83cce24891531394a216"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.901392 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerStarted","Data":"aee938dc4ed33beb99edd36f3fac89153eaa8eec16d34ab027c7251ddfae1e30"} Nov 25 15:06:46 crc 
kubenswrapper[4965]: I1125 15:06:46.904170 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:46 crc kubenswrapper[4965]: E1125 15:06:46.904359 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.404332255 +0000 UTC m=+152.371926001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.904445 4965 generic.go:334] "Generic (PLEG): container finished" podID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerID="2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7" exitCode=0 Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.904638 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt5k2" event={"ID":"0316a5a7-25ae-44be-ab7e-f3499e04aa8e","Type":"ContainerDied","Data":"2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7"} Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.953052 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:46 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:46 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:46 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.953108 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:46 crc kubenswrapper[4965]: I1125 15:06:46.996151 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j8f94"] Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.007056 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.008761 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.508750904 +0000 UTC m=+152.476344650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.109399 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.109664 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.609632477 +0000 UTC m=+152.577226223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.109811 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.110151 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.61013848 +0000 UTC m=+152.577732226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.211388 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.211797 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.711779344 +0000 UTC m=+152.679373090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.312374 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.312637 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.812625455 +0000 UTC m=+152.780219201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.415622 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.416236 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:47.916219912 +0000 UTC m=+152.883813658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.518428 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.518731 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.018719119 +0000 UTC m=+152.986312865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.621594 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.621873 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.121858584 +0000 UTC m=+153.089452330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.730839 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.731258 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.23124384 +0000 UTC m=+153.198837596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.831487 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.832125 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.332107012 +0000 UTC m=+153.299700758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.933647 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:47 crc kubenswrapper[4965]: E1125 15:06:47.933920 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.43390899 +0000 UTC m=+153.401502736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.948568 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:47 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:47 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:47 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:47 crc kubenswrapper[4965]: I1125 15:06:47.948622 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.023704 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3dbf4630528bcf24e4b5849334145c93e1e08f6ece241dba5c451e71a6fde6d8"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.034934 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.035488 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.53545214 +0000 UTC m=+153.503045886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.036257 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.036691 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.536678114 +0000 UTC m=+153.504271860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.080827 4965 generic.go:334] "Generic (PLEG): container finished" podID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerID="b051a7237f72dc7af7c4e4be4425ef060da5785a432f83cce24891531394a216" exitCode=0 Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.080923 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerDied","Data":"b051a7237f72dc7af7c4e4be4425ef060da5785a432f83cce24891531394a216"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.109806 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" event={"ID":"50a60857-8df9-45be-91b8-a41878677884","Type":"ContainerStarted","Data":"d02660a1c0e996394f4a5bc08c5acb6d1829cbb0c02140150749f47a9fd5bb10"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.136250 4965 generic.go:334] "Generic (PLEG): container finished" podID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerID="5ce18658d64d39d4f5d01dddf5a1acf245e7e956b136e9d044187290a5d4499f" exitCode=0 Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.139420 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerDied","Data":"5ce18658d64d39d4f5d01dddf5a1acf245e7e956b136e9d044187290a5d4499f"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.139462 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerStarted","Data":"49e3567a71ee49846a9bd218bf24649ac4e972398eba162b21a11437df0488e3"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.140585 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.141678 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.641655549 +0000 UTC m=+153.609249295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.161896 4965 generic.go:334] "Generic (PLEG): container finished" podID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerID="6f4ae749b08f1505ccb07a1541823399da8b6b6309573d7f4d3c0e562108c09e" exitCode=0 Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.162853 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerDied","Data":"6f4ae749b08f1505ccb07a1541823399da8b6b6309573d7f4d3c0e562108c09e"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.180032 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpgvl"] Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.190939 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4bd4b6a7c4eed24ea91148d9c1ae292856ce87880fc843b044e736fb7b034d21"} Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.248853 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:48 crc 
kubenswrapper[4965]: E1125 15:06:48.250336 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.750323136 +0000 UTC m=+153.717916882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.349949 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.350513 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.850491399 +0000 UTC m=+153.818085155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.451529 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.451859 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:48.951844944 +0000 UTC m=+153.919438690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.554055 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.054033943 +0000 UTC m=+154.021627689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.554084 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.554488 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.554914 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.054906477 +0000 UTC m=+154.022500223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.661731 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.662334 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.162319088 +0000 UTC m=+154.129912834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.682915 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.683206 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.685861 4965 patch_prober.go:28] interesting pod/console-f9d7485db-pdprk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.685906 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pdprk" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.764706 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:48 crc kubenswrapper[4965]: 
E1125 15:06:48.765354 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.2653357 +0000 UTC m=+154.232929496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.852112 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.852177 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.865430 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.865746 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.365731649 +0000 UTC m=+154.333325395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.925242 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.925301 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.925247 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.925377 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.950798 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.958123 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:48 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:48 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:48 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.958211 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.969787 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:48 crc kubenswrapper[4965]: E1125 15:06:48.970781 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.470768745 +0000 UTC m=+154.438362481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:48 crc kubenswrapper[4965]: I1125 15:06:48.973539 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-268bv" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.088680 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.088848 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.588822379 +0000 UTC m=+154.556416125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.089252 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.089696 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.589678374 +0000 UTC m=+154.557272120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.156574 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.190788 4965 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2b4mm container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.190841 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" podUID="d3d12281-f14b-432b-bc6a-7d8d4b9e933e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.191594 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.192193 
4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.69216763 +0000 UTC m=+154.659761376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.236790 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bae77fe677dc5cdba6dac33c8a65bdd41e94e6513086ea5ff0e21c2459f81c82"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.236831 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"47abcd0a060a681897b6c1f516c3bb8caf5245c468f5000442f2baf7ed22ac09"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.238653 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerStarted","Data":"93c3ca6634906b527d6f39994f5e1c107bd9c01654ea1dab873f1d80a6fa60fd"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.238681 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" 
event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerStarted","Data":"63b03e626aec0ee035b88eb56321fe6576f4e7f07c141442ae0d5ba91331e377"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.285121 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.291608 4965 generic.go:334] "Generic (PLEG): container finished" podID="3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" containerID="e20beb0854d305b63b3a390e6186a97cb6bcf36b71530a02d770432cf0f26b8c" exitCode=0 Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.291698 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" event={"ID":"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a","Type":"ContainerDied","Data":"e20beb0854d305b63b3a390e6186a97cb6bcf36b71530a02d770432cf0f26b8c"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.293862 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.319109 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.819092349 +0000 UTC m=+154.786686085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.356657 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dea92fd89b5d32c7583c98e90e8f36599a759e99faf595d76d8c91ee9640bd9a"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.357533 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.384825 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c4f84b61e2551762d0f3711902f619b06310f6632033fdd8fc3f1046efbdd43b"} Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.397005 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.397739 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-11-25 15:06:49.89772256 +0000 UTC m=+154.865316306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.487033 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5mpvj" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.498760 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.501319 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.001255905 +0000 UTC m=+154.968849651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.602816 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.603991 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.103955297 +0000 UTC m=+155.071549053 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.632199 4965 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2b4mm container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.632262 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" podUID="d3d12281-f14b-432b-bc6a-7d8d4b9e933e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.668318 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.668603 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.697590 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.705092 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.705403 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.205391325 +0000 UTC m=+155.172985071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.738247 4965 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qvn4k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]log ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]etcd ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/generic-apiserver-start-informers ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/max-in-flight-filter ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 25 15:06:49 crc 
kubenswrapper[4965]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 25 15:06:49 crc kubenswrapper[4965]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 25 15:06:49 crc kubenswrapper[4965]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/project.openshift.io-projectcache ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/openshift.io-startinformers ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 25 15:06:49 crc kubenswrapper[4965]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 25 15:06:49 crc kubenswrapper[4965]: livez check failed Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.738332 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" podUID="d55c18b9-bb80-428c-95ea-f21c6b0694e4" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.807440 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.807700 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.307655885 +0000 UTC m=+155.275249631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.807801 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.808746 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.308731964 +0000 UTC m=+155.276325710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.852865 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2b4mm" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.909134 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.909287 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.409265297 +0000 UTC m=+155.376859043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.909418 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:49 crc kubenswrapper[4965]: E1125 15:06:49.910419 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.410407849 +0000 UTC m=+155.378001595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.950812 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:49 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:49 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:49 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.950871 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:49 crc kubenswrapper[4965]: I1125 15:06:49.964024 4965 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.011487 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:50 crc kubenswrapper[4965]: E1125 15:06:50.011778 4965 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.511763175 +0000 UTC m=+155.479356921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.114738 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:50 crc kubenswrapper[4965]: E1125 15:06:50.115178 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.615163477 +0000 UTC m=+155.582757223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.202422 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.203026 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.215854 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:50 crc kubenswrapper[4965]: E1125 15:06:50.216211 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.716197183 +0000 UTC m=+155.683790919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.221282 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.221456 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.249442 4965 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-25T15:06:49.964047533Z","Handler":null,"Name":""} Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.317424 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.317480 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.317527 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:50 crc kubenswrapper[4965]: E1125 15:06:50.317782 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.817770174 +0000 UTC m=+155.785363920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22g9m" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.325104 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.416365 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" event={"ID":"50a60857-8df9-45be-91b8-a41878677884","Type":"ContainerStarted","Data":"69b0a1abb020cad89e8a417b11051a6ec013d1401f68953ce012636c6b05993d"} Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.419586 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.419724 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.419770 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: E1125 15:06:50.420126 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:06:50.920112928 +0000 UTC m=+155.887706674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.420159 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.449153 4965 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.449188 4965 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.460305 4965 generic.go:334] "Generic (PLEG): container finished" podID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerID="93c3ca6634906b527d6f39994f5e1c107bd9c01654ea1dab873f1d80a6fa60fd" exitCode=0 Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.460824 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerDied","Data":"93c3ca6634906b527d6f39994f5e1c107bd9c01654ea1dab873f1d80a6fa60fd"} Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.485483 4965 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6xcqz" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.521561 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.540173 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.540425 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.542721 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.543359 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.551218 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.551413 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.563660 4965 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.563707 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.580845 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-t99cv" podStartSLOduration=14.580827434 podStartE2EDuration="14.580827434s" podCreationTimestamp="2025-11-25 15:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:50.552357412 +0000 UTC m=+155.519951158" watchObservedRunningTime="2025-11-25 15:06:50.580827434 +0000 UTC m=+155.548421180" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.607197 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.631893 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.631940 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.735388 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.735428 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.735540 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.803850 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.913024 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.951923 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:50 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:50 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:50 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:50 crc kubenswrapper[4965]: I1125 15:06:50.951981 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.134856 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22g9m\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:51 crc 
kubenswrapper[4965]: I1125 15:06:51.150900 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.292502 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.294402 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.355492 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-config-volume\") pod \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.355818 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwb8x\" (UniqueName: \"kubernetes.io/projected/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-kube-api-access-hwb8x\") pod \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.355899 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-secret-volume\") pod \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\" (UID: \"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a\") " Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.372366 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" (UID: "3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.376174 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" (UID: "3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.383316 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-kube-api-access-hwb8x" (OuterVolumeSpecName: "kube-api-access-hwb8x") pod "3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" (UID: "3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a"). InnerVolumeSpecName "kube-api-access-hwb8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.465510 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwb8x\" (UniqueName: \"kubernetes.io/projected/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-kube-api-access-hwb8x\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.465546 4965 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.465558 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.545255 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.548374 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs" event={"ID":"3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a","Type":"ContainerDied","Data":"2a740e80da2e6c91c3f1fd75501970239d353d356f135be04cc628654076d685"} Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.548405 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a740e80da2e6c91c3f1fd75501970239d353d356f135be04cc628654076d685" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.548420 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.578826 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.645534 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 15:06:51 crc kubenswrapper[4965]: W1125 15:06:51.677204 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod52b5074a_f1cc_4f7a_b334_306c0bdf7afc.slice/crio-75068b86abcc1fdba49dcc4482440710c6a9ec8af37f11e6d1583a3f4565d636 WatchSource:0}: Error finding container 75068b86abcc1fdba49dcc4482440710c6a9ec8af37f11e6d1583a3f4565d636: Status 404 returned error can't find the container with id 75068b86abcc1fdba49dcc4482440710c6a9ec8af37f11e6d1583a3f4565d636 Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.945746 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:51 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:51 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:51 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.945802 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:51 crc kubenswrapper[4965]: I1125 15:06:51.964816 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-22g9m"] Nov 25 15:06:52 crc kubenswrapper[4965]: I1125 15:06:52.583974 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52b5074a-f1cc-4f7a-b334-306c0bdf7afc","Type":"ContainerStarted","Data":"75068b86abcc1fdba49dcc4482440710c6a9ec8af37f11e6d1583a3f4565d636"} Nov 25 15:06:52 crc kubenswrapper[4965]: I1125 15:06:52.586322 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac","Type":"ContainerStarted","Data":"3d91bc5017c56522198297db514ded6b75ff178d4407ed9bbc79d7c54d391f66"} Nov 25 15:06:52 crc kubenswrapper[4965]: I1125 15:06:52.588397 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" event={"ID":"8c017b51-468b-4ff4-9524-1e4349a54323","Type":"ContainerStarted","Data":"1d79333568b27b0f13491a19fa49c1abd9ad1ade83c0768b4f3490a657121d31"} Nov 25 15:06:52 crc kubenswrapper[4965]: I1125 15:06:52.781669 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 25 15:06:52 crc kubenswrapper[4965]: I1125 15:06:52.959583 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:52 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:52 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:52 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:52 crc kubenswrapper[4965]: I1125 15:06:52.959690 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" 
podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.260655 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.260716 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.651808 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac","Type":"ContainerStarted","Data":"e0e22f3df711377511fabb57dff97c54aee4ca966f68c9292c840137c5e38d96"} Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.665211 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" event={"ID":"8c017b51-468b-4ff4-9524-1e4349a54323","Type":"ContainerStarted","Data":"f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb"} Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.665756 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.669574 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"52b5074a-f1cc-4f7a-b334-306c0bdf7afc","Type":"ContainerStarted","Data":"d2150d29e77ef58593fc744ca6ca77faa7f788b817773eae3039f26f360877d2"} Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.698960 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.6989416779999997 podStartE2EDuration="3.698941678s" podCreationTimestamp="2025-11-25 15:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:53.695409041 +0000 UTC m=+158.663002787" watchObservedRunningTime="2025-11-25 15:06:53.698941678 +0000 UTC m=+158.666535424" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.760898 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" podStartSLOduration=136.76087461 podStartE2EDuration="2m16.76087461s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:53.757247631 +0000 UTC m=+158.724841397" watchObservedRunningTime="2025-11-25 15:06:53.76087461 +0000 UTC m=+158.728468356" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.789280 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.78925862 podStartE2EDuration="3.78925862s" podCreationTimestamp="2025-11-25 15:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:06:53.788732646 +0000 UTC m=+158.756326392" watchObservedRunningTime="2025-11-25 15:06:53.78925862 +0000 UTC m=+158.756852366" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.858865 4965 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.866105 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qvn4k" Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.950783 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:53 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:53 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:53 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:53 crc kubenswrapper[4965]: I1125 15:06:53.950840 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:54 crc kubenswrapper[4965]: I1125 15:06:54.688474 4965 generic.go:334] "Generic (PLEG): container finished" podID="52b5074a-f1cc-4f7a-b334-306c0bdf7afc" containerID="d2150d29e77ef58593fc744ca6ca77faa7f788b817773eae3039f26f360877d2" exitCode=0 Nov 25 15:06:54 crc kubenswrapper[4965]: I1125 15:06:54.688745 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52b5074a-f1cc-4f7a-b334-306c0bdf7afc","Type":"ContainerDied","Data":"d2150d29e77ef58593fc744ca6ca77faa7f788b817773eae3039f26f360877d2"} Nov 25 15:06:54 crc kubenswrapper[4965]: I1125 15:06:54.696834 4965 generic.go:334] "Generic (PLEG): container finished" podID="4b5e8804-94a3-49a1-a061-e0d5e1ace7ac" containerID="e0e22f3df711377511fabb57dff97c54aee4ca966f68c9292c840137c5e38d96" exitCode=0 Nov 25 15:06:54 crc kubenswrapper[4965]: I1125 
15:06:54.697620 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac","Type":"ContainerDied","Data":"e0e22f3df711377511fabb57dff97c54aee4ca966f68c9292c840137c5e38d96"} Nov 25 15:06:54 crc kubenswrapper[4965]: I1125 15:06:54.956273 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:54 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:54 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:54 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:54 crc kubenswrapper[4965]: I1125 15:06:54.956324 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:55 crc kubenswrapper[4965]: I1125 15:06:55.026460 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mlsxj" Nov 25 15:06:55 crc kubenswrapper[4965]: I1125 15:06:55.945248 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:55 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:55 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:55 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:55 crc kubenswrapper[4965]: I1125 15:06:55.945294 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" 
podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.209698 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.251617 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.372690 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kube-api-access\") pod \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.372769 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kubelet-dir\") pod \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\" (UID: \"52b5074a-f1cc-4f7a-b334-306c0bdf7afc\") " Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.372816 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kube-api-access\") pod \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.373483 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52b5074a-f1cc-4f7a-b334-306c0bdf7afc" (UID: "52b5074a-f1cc-4f7a-b334-306c0bdf7afc"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.375072 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4b5e8804-94a3-49a1-a061-e0d5e1ace7ac" (UID: "4b5e8804-94a3-49a1-a061-e0d5e1ace7ac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.374783 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kubelet-dir\") pod \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\" (UID: \"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac\") " Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.375415 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.375426 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.382092 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4b5e8804-94a3-49a1-a061-e0d5e1ace7ac" (UID: "4b5e8804-94a3-49a1-a061-e0d5e1ace7ac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.383089 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52b5074a-f1cc-4f7a-b334-306c0bdf7afc" (UID: "52b5074a-f1cc-4f7a-b334-306c0bdf7afc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.476691 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b5e8804-94a3-49a1-a061-e0d5e1ace7ac-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.476727 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b5074a-f1cc-4f7a-b334-306c0bdf7afc-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.732103 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b5e8804-94a3-49a1-a061-e0d5e1ace7ac","Type":"ContainerDied","Data":"3d91bc5017c56522198297db514ded6b75ff178d4407ed9bbc79d7c54d391f66"} Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.732147 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d91bc5017c56522198297db514ded6b75ff178d4407ed9bbc79d7c54d391f66" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.732228 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.750221 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52b5074a-f1cc-4f7a-b334-306c0bdf7afc","Type":"ContainerDied","Data":"75068b86abcc1fdba49dcc4482440710c6a9ec8af37f11e6d1583a3f4565d636"} Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.751313 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75068b86abcc1fdba49dcc4482440710c6a9ec8af37f11e6d1583a3f4565d636" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.751501 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.951206 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:56 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:56 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:56 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:56 crc kubenswrapper[4965]: I1125 15:06:56.951274 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:57 crc kubenswrapper[4965]: I1125 15:06:57.944747 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:57 crc 
kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:57 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:57 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:57 crc kubenswrapper[4965]: I1125 15:06:57.945030 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.684656 4965 patch_prober.go:28] interesting pod/console-f9d7485db-pdprk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.684724 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pdprk" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.926801 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.926980 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.930617 4965 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-2pf5t container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.930937 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2pf5t" podUID="b6686556-54b1-4232-a5ec-7bacd966ff86" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.949106 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:58 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:58 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:58 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:58 crc kubenswrapper[4965]: I1125 15:06:58.949186 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:06:59 crc kubenswrapper[4965]: I1125 15:06:59.945002 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:06:59 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Nov 25 15:06:59 crc kubenswrapper[4965]: [+]process-running ok Nov 25 15:06:59 crc kubenswrapper[4965]: healthz check failed Nov 25 15:06:59 crc kubenswrapper[4965]: I1125 
15:06:59.945267 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:07:00 crc kubenswrapper[4965]: I1125 15:07:00.149350 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:07:00 crc kubenswrapper[4965]: I1125 15:07:00.154686 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed72551-610b-4f03-8a57-319ef27e27e0-metrics-certs\") pod \"network-metrics-daemon-j87z5\" (UID: \"6ed72551-610b-4f03-8a57-319ef27e27e0\") " pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:07:00 crc kubenswrapper[4965]: I1125 15:07:00.406622 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j87z5" Nov 25 15:07:00 crc kubenswrapper[4965]: I1125 15:07:00.946104 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:07:00 crc kubenswrapper[4965]: I1125 15:07:00.949623 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-82czk" Nov 25 15:07:08 crc kubenswrapper[4965]: I1125 15:07:08.760647 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:07:08 crc kubenswrapper[4965]: I1125 15:07:08.786895 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:07:08 crc kubenswrapper[4965]: I1125 15:07:08.931478 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2pf5t" Nov 25 15:07:11 crc kubenswrapper[4965]: I1125 15:07:11.300061 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:07:18 crc kubenswrapper[4965]: I1125 15:07:18.970784 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2t52f" Nov 25 15:07:23 crc kubenswrapper[4965]: I1125 15:07:23.261252 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:07:23 crc kubenswrapper[4965]: I1125 15:07:23.261670 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" 
podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:07:25 crc kubenswrapper[4965]: I1125 15:07:25.801501 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.423173 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 15:07:34 crc kubenswrapper[4965]: E1125 15:07:34.423915 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" containerName="collect-profiles" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.423929 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" containerName="collect-profiles" Nov 25 15:07:34 crc kubenswrapper[4965]: E1125 15:07:34.423941 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5e8804-94a3-49a1-a061-e0d5e1ace7ac" containerName="pruner" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.423949 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5e8804-94a3-49a1-a061-e0d5e1ace7ac" containerName="pruner" Nov 25 15:07:34 crc kubenswrapper[4965]: E1125 15:07:34.423959 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b5074a-f1cc-4f7a-b334-306c0bdf7afc" containerName="pruner" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.423986 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b5074a-f1cc-4f7a-b334-306c0bdf7afc" containerName="pruner" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.424099 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5e8804-94a3-49a1-a061-e0d5e1ace7ac" containerName="pruner" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.424119 4965 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" containerName="collect-profiles" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.424129 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b5074a-f1cc-4f7a-b334-306c0bdf7afc" containerName="pruner" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.424544 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.428312 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.428616 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.445528 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.506910 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98f014b-3c55-409a-9d2c-4fc3318bf200-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.507013 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98f014b-3c55-409a-9d2c-4fc3318bf200-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.608673 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98f014b-3c55-409a-9d2c-4fc3318bf200-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.608769 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98f014b-3c55-409a-9d2c-4fc3318bf200-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.608844 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98f014b-3c55-409a-9d2c-4fc3318bf200-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.638593 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98f014b-3c55-409a-9d2c-4fc3318bf200-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:34 crc kubenswrapper[4965]: I1125 15:07:34.750637 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:37 crc kubenswrapper[4965]: E1125 15:07:37.546889 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 25 15:07:37 crc kubenswrapper[4965]: E1125 15:07:37.547856 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wwgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-j8f94_openshift-marketplace(2fccc0df-85ec-4aeb-9217-00c37ea16e67): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:37 crc kubenswrapper[4965]: E1125 15:07:37.549110 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-j8f94" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" Nov 25 15:07:38 crc kubenswrapper[4965]: E1125 15:07:38.648893 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 25 15:07:38 crc kubenswrapper[4965]: E1125 15:07:38.649066 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxhdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kpgvl_openshift-marketplace(7b6bd509-fa4d-47c9-a0ca-62eac6f717a0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:38 crc kubenswrapper[4965]: E1125 15:07:38.650322 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kpgvl" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" Nov 25 15:07:39 crc 
kubenswrapper[4965]: I1125 15:07:39.435617 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.437205 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.438340 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.470238 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a927b1ca-0619-4f2f-89bd-5583da792645-kube-api-access\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.470537 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-var-lock\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.470702 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.571820 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " 
pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.571883 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a927b1ca-0619-4f2f-89bd-5583da792645-kube-api-access\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.571920 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-var-lock\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.572068 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.572424 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-var-lock\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.606848 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a927b1ca-0619-4f2f-89bd-5583da792645-kube-api-access\") pod \"installer-9-crc\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:39 crc kubenswrapper[4965]: I1125 15:07:39.761486 4965 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.043406 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kpgvl" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.044078 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-j8f94" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.155305 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.155600 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvmjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s9jt9_openshift-marketplace(1269cb10-777f-46e4-a52f-8088e7af6b2d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.156730 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s9jt9" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" Nov 25 15:07:44 crc 
kubenswrapper[4965]: E1125 15:07:44.165404 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.165532 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnvk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-fvgv6_openshift-marketplace(d950b336-b79c-4b02-a695-66f4757027ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:44 crc kubenswrapper[4965]: E1125 15:07:44.167471 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fvgv6" podUID="d950b336-b79c-4b02-a695-66f4757027ca" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.234419 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s9jt9" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.234421 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fvgv6" podUID="d950b336-b79c-4b02-a695-66f4757027ca" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.312586 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.312749 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhvtz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-g9kt4_openshift-marketplace(be82ef6f-bd03-4c9f-a760-d836fccf52a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.313993 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g9kt4" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.314246 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.314380 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs9tv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ndvs4_openshift-marketplace(cd34d943-ba97-48d9-bb26-cfa6d0ec549b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:45 crc kubenswrapper[4965]: E1125 15:07:45.315521 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ndvs4" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.874596 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ndvs4" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.874736 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g9kt4" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.930570 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.930729 4965 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qh2g4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zt5k2_openshift-marketplace(0316a5a7-25ae-44be-ab7e-f3499e04aa8e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.932902 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zt5k2" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.948049 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.948245 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xpjkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},St
din:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6blkb_openshift-marketplace(79a7c128-3c65-491c-95b6-52337183df64): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:07:46 crc kubenswrapper[4965]: E1125 15:07:46.949620 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6blkb" podUID="79a7c128-3c65-491c-95b6-52337183df64" Nov 25 15:07:47 crc kubenswrapper[4965]: I1125 15:07:47.094569 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j87z5"] Nov 25 15:07:47 crc kubenswrapper[4965]: E1125 15:07:47.096436 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6blkb" podUID="79a7c128-3c65-491c-95b6-52337183df64" Nov 25 15:07:47 crc kubenswrapper[4965]: E1125 15:07:47.096855 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zt5k2" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" Nov 25 15:07:47 crc kubenswrapper[4965]: I1125 15:07:47.382508 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/installer-9-crc"] Nov 25 15:07:47 crc kubenswrapper[4965]: I1125 15:07:47.396357 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.104529 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a927b1ca-0619-4f2f-89bd-5583da792645","Type":"ContainerStarted","Data":"61ea41873abba05582e0d11a94923835c7bd41458625f5e7b926cb2b917f323d"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.105007 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a927b1ca-0619-4f2f-89bd-5583da792645","Type":"ContainerStarted","Data":"cd7a1e4fe723a5e8cdf73b2685663f4bbab6fb26c309107c15d37b0748e1da91"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.107533 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d98f014b-3c55-409a-9d2c-4fc3318bf200","Type":"ContainerStarted","Data":"12dfe1e378fd04ddbd5fb1652fb663841d55044c78059ec7ad7b23d07be768ea"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.107594 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d98f014b-3c55-409a-9d2c-4fc3318bf200","Type":"ContainerStarted","Data":"244e8959ed960a6985569d1291825c8dd35af812aa0caae51ebb1a050b9f7361"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.112149 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j87z5" event={"ID":"6ed72551-610b-4f03-8a57-319ef27e27e0","Type":"ContainerStarted","Data":"ab3433e01194c609534b30761075c816638c25c4c5e0d362d9813445eac5016a"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.112178 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j87z5" 
event={"ID":"6ed72551-610b-4f03-8a57-319ef27e27e0","Type":"ContainerStarted","Data":"1f505319e58bd0128a55e2a954bb536653b42cd13e8059e80c33fdca5f82d20a"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.112187 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j87z5" event={"ID":"6ed72551-610b-4f03-8a57-319ef27e27e0","Type":"ContainerStarted","Data":"8f09a8f8392bcc88943f7e33619c94aed1f04f09c985315519ab6ed99d299bf2"} Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.124883 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.124864932 podStartE2EDuration="9.124864932s" podCreationTimestamp="2025-11-25 15:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:07:48.123518006 +0000 UTC m=+213.091111762" watchObservedRunningTime="2025-11-25 15:07:48.124864932 +0000 UTC m=+213.092458688" Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.149280 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j87z5" podStartSLOduration=191.149262334 podStartE2EDuration="3m11.149262334s" podCreationTimestamp="2025-11-25 15:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:07:48.147471535 +0000 UTC m=+213.115065281" watchObservedRunningTime="2025-11-25 15:07:48.149262334 +0000 UTC m=+213.116856080" Nov 25 15:07:48 crc kubenswrapper[4965]: I1125 15:07:48.170233 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=14.170219103 podStartE2EDuration="14.170219103s" podCreationTimestamp="2025-11-25 15:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:07:48.167090857 +0000 UTC m=+213.134684603" watchObservedRunningTime="2025-11-25 15:07:48.170219103 +0000 UTC m=+213.137812849" Nov 25 15:07:49 crc kubenswrapper[4965]: I1125 15:07:49.118592 4965 generic.go:334] "Generic (PLEG): container finished" podID="d98f014b-3c55-409a-9d2c-4fc3318bf200" containerID="12dfe1e378fd04ddbd5fb1652fb663841d55044c78059ec7ad7b23d07be768ea" exitCode=0 Nov 25 15:07:49 crc kubenswrapper[4965]: I1125 15:07:49.118688 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d98f014b-3c55-409a-9d2c-4fc3318bf200","Type":"ContainerDied","Data":"12dfe1e378fd04ddbd5fb1652fb663841d55044c78059ec7ad7b23d07be768ea"} Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.348217 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.510915 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98f014b-3c55-409a-9d2c-4fc3318bf200-kube-api-access\") pod \"d98f014b-3c55-409a-9d2c-4fc3318bf200\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.511379 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98f014b-3c55-409a-9d2c-4fc3318bf200-kubelet-dir\") pod \"d98f014b-3c55-409a-9d2c-4fc3318bf200\" (UID: \"d98f014b-3c55-409a-9d2c-4fc3318bf200\") " Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.511569 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d98f014b-3c55-409a-9d2c-4fc3318bf200-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d98f014b-3c55-409a-9d2c-4fc3318bf200" (UID: 
"d98f014b-3c55-409a-9d2c-4fc3318bf200"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.516849 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98f014b-3c55-409a-9d2c-4fc3318bf200-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d98f014b-3c55-409a-9d2c-4fc3318bf200" (UID: "d98f014b-3c55-409a-9d2c-4fc3318bf200"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.612596 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98f014b-3c55-409a-9d2c-4fc3318bf200-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:07:50 crc kubenswrapper[4965]: I1125 15:07:50.612621 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98f014b-3c55-409a-9d2c-4fc3318bf200-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:07:51 crc kubenswrapper[4965]: I1125 15:07:51.133717 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d98f014b-3c55-409a-9d2c-4fc3318bf200","Type":"ContainerDied","Data":"244e8959ed960a6985569d1291825c8dd35af812aa0caae51ebb1a050b9f7361"} Nov 25 15:07:51 crc kubenswrapper[4965]: I1125 15:07:51.133770 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244e8959ed960a6985569d1291825c8dd35af812aa0caae51ebb1a050b9f7361" Nov 25 15:07:51 crc kubenswrapper[4965]: I1125 15:07:51.133790 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 15:07:53 crc kubenswrapper[4965]: I1125 15:07:53.260298 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:07:53 crc kubenswrapper[4965]: I1125 15:07:53.260698 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:07:53 crc kubenswrapper[4965]: I1125 15:07:53.260771 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:07:53 crc kubenswrapper[4965]: I1125 15:07:53.261490 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:07:53 crc kubenswrapper[4965]: I1125 15:07:53.261632 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e" gracePeriod=600 Nov 25 15:07:54 crc kubenswrapper[4965]: I1125 15:07:54.162794 4965 generic.go:334] "Generic (PLEG): container finished" 
podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e" exitCode=0 Nov 25 15:07:54 crc kubenswrapper[4965]: I1125 15:07:54.163055 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e"} Nov 25 15:07:54 crc kubenswrapper[4965]: I1125 15:07:54.163184 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"5219e353dd210900b2aa5e32adf4b960a9f3443e7b6f3437667737ce403d5782"} Nov 25 15:07:57 crc kubenswrapper[4965]: I1125 15:07:57.179450 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerStarted","Data":"92ea833181d5ffbb43d432f0df423f7b0dd9c18be77e5a840f62ea941fb7e618"} Nov 25 15:07:57 crc kubenswrapper[4965]: I1125 15:07:57.182056 4965 generic.go:334] "Generic (PLEG): container finished" podID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerID="6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7" exitCode=0 Nov 25 15:07:57 crc kubenswrapper[4965]: I1125 15:07:57.182095 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9jt9" event={"ID":"1269cb10-777f-46e4-a52f-8088e7af6b2d","Type":"ContainerDied","Data":"6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7"} Nov 25 15:07:58 crc kubenswrapper[4965]: I1125 15:07:58.191484 4965 generic.go:334] "Generic (PLEG): container finished" podID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerID="92ea833181d5ffbb43d432f0df423f7b0dd9c18be77e5a840f62ea941fb7e618" exitCode=0 Nov 25 15:07:58 crc 
kubenswrapper[4965]: I1125 15:07:58.191822 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerDied","Data":"92ea833181d5ffbb43d432f0df423f7b0dd9c18be77e5a840f62ea941fb7e618"} Nov 25 15:07:58 crc kubenswrapper[4965]: I1125 15:07:58.197375 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9jt9" event={"ID":"1269cb10-777f-46e4-a52f-8088e7af6b2d","Type":"ContainerStarted","Data":"cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456"} Nov 25 15:07:58 crc kubenswrapper[4965]: I1125 15:07:58.244047 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9jt9" podStartSLOduration=5.539199917 podStartE2EDuration="1m16.244032167s" podCreationTimestamp="2025-11-25 15:06:42 +0000 UTC" firstStartedPulling="2025-11-25 15:06:46.88232905 +0000 UTC m=+151.849922796" lastFinishedPulling="2025-11-25 15:07:57.5871613 +0000 UTC m=+222.554755046" observedRunningTime="2025-11-25 15:07:58.240756147 +0000 UTC m=+223.208349893" watchObservedRunningTime="2025-11-25 15:07:58.244032167 +0000 UTC m=+223.211625903" Nov 25 15:07:58 crc kubenswrapper[4965]: I1125 15:07:58.606922 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fjgbs"] Nov 25 15:07:59 crc kubenswrapper[4965]: I1125 15:07:59.227567 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerStarted","Data":"c291409478cc3d7ea1c6099bdd5fd1b0a588fbb5dcb45dc6d4a70423411c8b85"} Nov 25 15:07:59 crc kubenswrapper[4965]: I1125 15:07:59.230859 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" 
event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerStarted","Data":"fc5208ff388cc5be0685cc6826bf2fc28fa93337d185fc4750b6819994cb1687"} Nov 25 15:07:59 crc kubenswrapper[4965]: I1125 15:07:59.236032 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerStarted","Data":"63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc"} Nov 25 15:07:59 crc kubenswrapper[4965]: I1125 15:07:59.264305 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpgvl" podStartSLOduration=4.905021795 podStartE2EDuration="1m14.264290815s" podCreationTimestamp="2025-11-25 15:06:45 +0000 UTC" firstStartedPulling="2025-11-25 15:06:49.248144738 +0000 UTC m=+154.215738484" lastFinishedPulling="2025-11-25 15:07:58.607413768 +0000 UTC m=+223.575007504" observedRunningTime="2025-11-25 15:07:59.263548724 +0000 UTC m=+224.231142470" watchObservedRunningTime="2025-11-25 15:07:59.264290815 +0000 UTC m=+224.231884561" Nov 25 15:08:00 crc kubenswrapper[4965]: I1125 15:08:00.241791 4965 generic.go:334] "Generic (PLEG): container finished" podID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerID="fc5208ff388cc5be0685cc6826bf2fc28fa93337d185fc4750b6819994cb1687" exitCode=0 Nov 25 15:08:00 crc kubenswrapper[4965]: I1125 15:08:00.241847 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerDied","Data":"fc5208ff388cc5be0685cc6826bf2fc28fa93337d185fc4750b6819994cb1687"} Nov 25 15:08:00 crc kubenswrapper[4965]: I1125 15:08:00.244351 4965 generic.go:334] "Generic (PLEG): container finished" podID="d950b336-b79c-4b02-a695-66f4757027ca" containerID="63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc" exitCode=0 Nov 25 15:08:00 crc kubenswrapper[4965]: I1125 15:08:00.244375 
4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerDied","Data":"63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc"} Nov 25 15:08:01 crc kubenswrapper[4965]: I1125 15:08:01.251502 4965 generic.go:334] "Generic (PLEG): container finished" podID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerID="3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b" exitCode=0 Nov 25 15:08:01 crc kubenswrapper[4965]: I1125 15:08:01.251859 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt5k2" event={"ID":"0316a5a7-25ae-44be-ab7e-f3499e04aa8e","Type":"ContainerDied","Data":"3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b"} Nov 25 15:08:01 crc kubenswrapper[4965]: I1125 15:08:01.254238 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerStarted","Data":"8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928"} Nov 25 15:08:02 crc kubenswrapper[4965]: I1125 15:08:02.285086 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j8f94" podStartSLOduration=4.768427602 podStartE2EDuration="1m17.285067302s" podCreationTimestamp="2025-11-25 15:06:45 +0000 UTC" firstStartedPulling="2025-11-25 15:06:48.160068945 +0000 UTC m=+153.127662691" lastFinishedPulling="2025-11-25 15:08:00.676708645 +0000 UTC m=+225.644302391" observedRunningTime="2025-11-25 15:08:02.282017019 +0000 UTC m=+227.249610765" watchObservedRunningTime="2025-11-25 15:08:02.285067302 +0000 UTC m=+227.252661048" Nov 25 15:08:02 crc kubenswrapper[4965]: I1125 15:08:02.465564 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:08:02 crc 
kubenswrapper[4965]: I1125 15:08:02.465613 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:08:03 crc kubenswrapper[4965]: I1125 15:08:03.224739 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:08:03 crc kubenswrapper[4965]: I1125 15:08:03.310316 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:08:05 crc kubenswrapper[4965]: I1125 15:08:05.452386 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:08:05 crc kubenswrapper[4965]: I1125 15:08:05.453249 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:08:05 crc kubenswrapper[4965]: I1125 15:08:05.874409 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:08:05 crc kubenswrapper[4965]: I1125 15:08:05.874445 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:08:05 crc kubenswrapper[4965]: I1125 15:08:05.910733 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:08:06 crc kubenswrapper[4965]: I1125 15:08:06.320892 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:08:06 crc kubenswrapper[4965]: I1125 15:08:06.500165 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j8f94" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="registry-server" probeResult="failure" output=< Nov 25 15:08:06 crc kubenswrapper[4965]: 
timeout: failed to connect service ":50051" within 1s Nov 25 15:08:06 crc kubenswrapper[4965]: > Nov 25 15:08:07 crc kubenswrapper[4965]: I1125 15:08:07.000691 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpgvl"] Nov 25 15:08:08 crc kubenswrapper[4965]: I1125 15:08:08.294189 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kpgvl" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="registry-server" containerID="cri-o://c291409478cc3d7ea1c6099bdd5fd1b0a588fbb5dcb45dc6d4a70423411c8b85" gracePeriod=2 Nov 25 15:08:10 crc kubenswrapper[4965]: I1125 15:08:10.309851 4965 generic.go:334] "Generic (PLEG): container finished" podID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerID="c291409478cc3d7ea1c6099bdd5fd1b0a588fbb5dcb45dc6d4a70423411c8b85" exitCode=0 Nov 25 15:08:10 crc kubenswrapper[4965]: I1125 15:08:10.309901 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerDied","Data":"c291409478cc3d7ea1c6099bdd5fd1b0a588fbb5dcb45dc6d4a70423411c8b85"} Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.663812 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.742369 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxhdz\" (UniqueName: \"kubernetes.io/projected/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-kube-api-access-cxhdz\") pod \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.742437 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-catalog-content\") pod \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.742493 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-utilities\") pod \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\" (UID: \"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0\") " Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.743588 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-utilities" (OuterVolumeSpecName: "utilities") pod "7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" (UID: "7b6bd509-fa4d-47c9-a0ca-62eac6f717a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.753139 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-kube-api-access-cxhdz" (OuterVolumeSpecName: "kube-api-access-cxhdz") pod "7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" (UID: "7b6bd509-fa4d-47c9-a0ca-62eac6f717a0"). InnerVolumeSpecName "kube-api-access-cxhdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.844609 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:12 crc kubenswrapper[4965]: I1125 15:08:12.844648 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxhdz\" (UniqueName: \"kubernetes.io/projected/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-kube-api-access-cxhdz\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.125300 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" (UID: "7b6bd509-fa4d-47c9-a0ca-62eac6f717a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.149351 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.331500 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpgvl" Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.331494 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpgvl" event={"ID":"7b6bd509-fa4d-47c9-a0ca-62eac6f717a0","Type":"ContainerDied","Data":"63b03e626aec0ee035b88eb56321fe6576f4e7f07c141442ae0d5ba91331e377"} Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.331699 4965 scope.go:117] "RemoveContainer" containerID="c291409478cc3d7ea1c6099bdd5fd1b0a588fbb5dcb45dc6d4a70423411c8b85" Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.338060 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerStarted","Data":"2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d"} Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.359947 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpgvl"] Nov 25 15:08:13 crc kubenswrapper[4965]: I1125 15:08:13.366381 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kpgvl"] Nov 25 15:08:14 crc kubenswrapper[4965]: I1125 15:08:14.361229 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fvgv6" podStartSLOduration=17.127558604 podStartE2EDuration="1m32.361211945s" podCreationTimestamp="2025-11-25 15:06:42 +0000 UTC" firstStartedPulling="2025-11-25 15:06:46.889097836 +0000 UTC m=+151.856691582" lastFinishedPulling="2025-11-25 15:08:02.122751177 +0000 UTC m=+227.090344923" observedRunningTime="2025-11-25 15:08:14.357958657 +0000 UTC m=+239.325552413" watchObservedRunningTime="2025-11-25 15:08:14.361211945 +0000 UTC m=+239.328805701" Nov 25 15:08:14 crc kubenswrapper[4965]: I1125 15:08:14.471306 4965 scope.go:117] "RemoveContainer" 
containerID="92ea833181d5ffbb43d432f0df423f7b0dd9c18be77e5a840f62ea941fb7e618" Nov 25 15:08:14 crc kubenswrapper[4965]: I1125 15:08:14.525427 4965 scope.go:117] "RemoveContainer" containerID="93c3ca6634906b527d6f39994f5e1c107bd9c01654ea1dab873f1d80a6fa60fd" Nov 25 15:08:14 crc kubenswrapper[4965]: I1125 15:08:14.778086 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" path="/var/lib/kubelet/pods/7b6bd509-fa4d-47c9-a0ca-62eac6f717a0/volumes" Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.349467 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerStarted","Data":"6bf2466eea8ca0d1c05f8530a35593f71a2ee6143b05dd6414baa1ccdd97e99e"} Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.352486 4965 generic.go:334] "Generic (PLEG): container finished" podID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerID="f1f9b22925286490d960cc36b65af873cab76e846d20482426e9c7ace788718e" exitCode=0 Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.352573 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerDied","Data":"f1f9b22925286490d960cc36b65af873cab76e846d20482426e9c7ace788718e"} Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.357641 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt5k2" event={"ID":"0316a5a7-25ae-44be-ab7e-f3499e04aa8e","Type":"ContainerStarted","Data":"e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27"} Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.365922 4965 generic.go:334] "Generic (PLEG): container finished" podID="79a7c128-3c65-491c-95b6-52337183df64" containerID="9ede19fafee6f4cdb2ebc2de199584e8d5f9b0261394e33451d045f40b3c6e67" exitCode=0 Nov 25 
15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.365960 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6blkb" event={"ID":"79a7c128-3c65-491c-95b6-52337183df64","Type":"ContainerDied","Data":"9ede19fafee6f4cdb2ebc2de199584e8d5f9b0261394e33451d045f40b3c6e67"} Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.401325 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zt5k2" podStartSLOduration=5.871597633 podStartE2EDuration="1m33.401303951s" podCreationTimestamp="2025-11-25 15:06:42 +0000 UTC" firstStartedPulling="2025-11-25 15:06:46.928218991 +0000 UTC m=+151.895812737" lastFinishedPulling="2025-11-25 15:08:14.457925269 +0000 UTC m=+239.425519055" observedRunningTime="2025-11-25 15:08:15.398379172 +0000 UTC m=+240.365972928" watchObservedRunningTime="2025-11-25 15:08:15.401303951 +0000 UTC m=+240.368897707" Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.489584 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:08:15 crc kubenswrapper[4965]: I1125 15:08:15.539976 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:08:16 crc kubenswrapper[4965]: I1125 15:08:16.373622 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerStarted","Data":"2f1eeae762f660161d50f777adf757128f6476ed38191443e3367c46659183de"} Nov 25 15:08:16 crc kubenswrapper[4965]: I1125 15:08:16.375627 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6blkb" event={"ID":"79a7c128-3c65-491c-95b6-52337183df64","Type":"ContainerStarted","Data":"519627fa84fe75bca558edb39f2f6428755a1f9c4bd70537fc1675d218ba1059"} Nov 25 15:08:16 crc 
kubenswrapper[4965]: I1125 15:08:16.377160 4965 generic.go:334] "Generic (PLEG): container finished" podID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerID="6bf2466eea8ca0d1c05f8530a35593f71a2ee6143b05dd6414baa1ccdd97e99e" exitCode=0 Nov 25 15:08:16 crc kubenswrapper[4965]: I1125 15:08:16.377224 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerDied","Data":"6bf2466eea8ca0d1c05f8530a35593f71a2ee6143b05dd6414baa1ccdd97e99e"} Nov 25 15:08:16 crc kubenswrapper[4965]: I1125 15:08:16.397399 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g9kt4" podStartSLOduration=4.514218867 podStartE2EDuration="1m32.397381313s" podCreationTimestamp="2025-11-25 15:06:44 +0000 UTC" firstStartedPulling="2025-11-25 15:06:48.160436985 +0000 UTC m=+153.128030731" lastFinishedPulling="2025-11-25 15:08:16.043599421 +0000 UTC m=+241.011193177" observedRunningTime="2025-11-25 15:08:16.395623115 +0000 UTC m=+241.363216861" watchObservedRunningTime="2025-11-25 15:08:16.397381313 +0000 UTC m=+241.364975059" Nov 25 15:08:16 crc kubenswrapper[4965]: I1125 15:08:16.414069 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6blkb" podStartSLOduration=6.379495074 podStartE2EDuration="1m35.414049395s" podCreationTimestamp="2025-11-25 15:06:41 +0000 UTC" firstStartedPulling="2025-11-25 15:06:46.888799177 +0000 UTC m=+151.856392923" lastFinishedPulling="2025-11-25 15:08:15.923353498 +0000 UTC m=+240.890947244" observedRunningTime="2025-11-25 15:08:16.411392933 +0000 UTC m=+241.378986679" watchObservedRunningTime="2025-11-25 15:08:16.414049395 +0000 UTC m=+241.381643141" Nov 25 15:08:17 crc kubenswrapper[4965]: I1125 15:08:17.383473 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" 
event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerStarted","Data":"9fd377b309a2407202a91c7b56c3be8010cf1afe85c1deaa6821aa69f5199dab"} Nov 25 15:08:17 crc kubenswrapper[4965]: I1125 15:08:17.402375 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ndvs4" podStartSLOduration=5.814322004 podStartE2EDuration="1m34.402356185s" podCreationTimestamp="2025-11-25 15:06:43 +0000 UTC" firstStartedPulling="2025-11-25 15:06:48.173031852 +0000 UTC m=+153.140625588" lastFinishedPulling="2025-11-25 15:08:16.761066033 +0000 UTC m=+241.728659769" observedRunningTime="2025-11-25 15:08:17.401171244 +0000 UTC m=+242.368764990" watchObservedRunningTime="2025-11-25 15:08:17.402356185 +0000 UTC m=+242.369949931" Nov 25 15:08:22 crc kubenswrapper[4965]: I1125 15:08:22.538403 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:08:22 crc kubenswrapper[4965]: I1125 15:08:22.539109 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:08:22 crc kubenswrapper[4965]: I1125 15:08:22.603423 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.064535 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.064957 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.132541 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.155368 4965 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.156024 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.224764 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.467903 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.500447 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.503145 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:08:23 crc kubenswrapper[4965]: I1125 15:08:23.642711 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerName="oauth-openshift" containerID="cri-o://63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532" gracePeriod=15 Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.040460 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091277 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5477954dc8-r9rft"] Nov 25 15:08:24 crc kubenswrapper[4965]: E1125 15:08:24.091467 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerName="oauth-openshift" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091478 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerName="oauth-openshift" Nov 25 15:08:24 crc kubenswrapper[4965]: E1125 15:08:24.091491 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="registry-server" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091497 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="registry-server" Nov 25 15:08:24 crc kubenswrapper[4965]: E1125 15:08:24.091510 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98f014b-3c55-409a-9d2c-4fc3318bf200" containerName="pruner" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091516 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98f014b-3c55-409a-9d2c-4fc3318bf200" containerName="pruner" Nov 25 15:08:24 crc kubenswrapper[4965]: E1125 15:08:24.091528 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="extract-utilities" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091533 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="extract-utilities" Nov 25 15:08:24 crc kubenswrapper[4965]: E1125 15:08:24.091541 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="extract-content" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091546 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="extract-content" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091628 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98f014b-3c55-409a-9d2c-4fc3318bf200" containerName="pruner" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091639 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerName="oauth-openshift" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.091653 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6bd509-fa4d-47c9-a0ca-62eac6f717a0" containerName="registry-server" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.103197 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.108164 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5477954dc8-r9rft"] Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202384 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-error\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202437 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kxt7\" (UniqueName: \"kubernetes.io/projected/8a01a344-a2a2-4d3c-9bc3-5e911936606c-kube-api-access-6kxt7\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: 
\"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202517 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-dir\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202561 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-idp-0-file-data\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202588 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-serving-cert\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202628 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-ocp-branding-template\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202649 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-service-ca\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 
15:08:24.202673 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-trusted-ca-bundle\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202719 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-provider-selection\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202795 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-cliconfig\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202831 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-router-certs\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202887 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-login\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202913 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-policies\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.202980 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-session\") pod \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\" (UID: \"8a01a344-a2a2-4d3c-9bc3-5e911936606c\") " Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.203212 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-service-ca\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.203247 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-error\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204649 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " 
pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204711 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-audit-dir\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204776 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204798 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-router-certs\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204825 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.203619 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205651 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205770 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-session\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204168 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204224 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.204649 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205139 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205899 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205930 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-login\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205950 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwh7\" (UniqueName: \"kubernetes.io/projected/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-kube-api-access-7vwh7\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.205986 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.206011 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-audit-policies\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.206113 4965 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.206126 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.206136 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.206145 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.206155 4965 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a01a344-a2a2-4d3c-9bc3-5e911936606c-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.208118 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.208577 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.209101 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.209206 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a01a344-a2a2-4d3c-9bc3-5e911936606c-kube-api-access-6kxt7" (OuterVolumeSpecName: "kube-api-access-6kxt7") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "kube-api-access-6kxt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.209710 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.210181 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.210418 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.210469 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.248718 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8a01a344-a2a2-4d3c-9bc3-5e911936606c" (UID: "8a01a344-a2a2-4d3c-9bc3-5e911936606c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308191 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308234 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-login\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308251 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwh7\" (UniqueName: \"kubernetes.io/projected/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-kube-api-access-7vwh7\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308272 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308292 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-audit-policies\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308316 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-service-ca\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308333 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-error\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308350 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308374 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-audit-dir\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308392 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308408 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-router-certs\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308427 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308454 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308473 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-session\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308517 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308531 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308543 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308555 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc 
kubenswrapper[4965]: I1125 15:08:24.308567 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kxt7\" (UniqueName: \"kubernetes.io/projected/8a01a344-a2a2-4d3c-9bc3-5e911936606c-kube-api-access-6kxt7\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308580 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308588 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308599 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308608 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a01a344-a2a2-4d3c-9bc3-5e911936606c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.308843 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-audit-dir\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.309370 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-audit-policies\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.309529 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-service-ca\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.311064 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.311842 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.312371 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-session\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " 
pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.312371 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-login\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.313503 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-router-certs\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.313745 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.314953 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.315447 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.315716 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.316323 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-v4-0-config-user-template-error\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.323174 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwh7\" (UniqueName: \"kubernetes.io/projected/c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd-kube-api-access-7vwh7\") pod \"oauth-openshift-5477954dc8-r9rft\" (UID: \"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd\") " pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.398807 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.398882 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:08:24 
crc kubenswrapper[4965]: I1125 15:08:24.423904 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.439328 4965 generic.go:334] "Generic (PLEG): container finished" podID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" containerID="63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532" exitCode=0 Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.439788 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" event={"ID":"8a01a344-a2a2-4d3c-9bc3-5e911936606c","Type":"ContainerDied","Data":"63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532"} Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.439867 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" event={"ID":"8a01a344-a2a2-4d3c-9bc3-5e911936606c","Type":"ContainerDied","Data":"de8828ac02ad923511d9e829e6984e10bd0ea4ed7731c269483f7316ebb1fdf8"} Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.439909 4965 scope.go:117] "RemoveContainer" containerID="63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.440177 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fjgbs" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.441203 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.475648 4965 scope.go:117] "RemoveContainer" containerID="63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532" Nov 25 15:08:24 crc kubenswrapper[4965]: E1125 15:08:24.476293 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532\": container with ID starting with 63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532 not found: ID does not exist" containerID="63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.476435 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532"} err="failed to get container status \"63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532\": rpc error: code = NotFound desc = could not find container \"63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532\": container with ID starting with 63116483e2aa785a0e421e83a397619ab6e99eac200396e487d6459e25c54532 not found: ID does not exist" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.490476 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fjgbs"] Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.496711 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fjgbs"] Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.500917 4965 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.746558 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.746598 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.791540 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a01a344-a2a2-4d3c-9bc3-5e911936606c" path="/var/lib/kubelet/pods/8a01a344-a2a2-4d3c-9bc3-5e911936606c/volumes" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.793373 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:08:24 crc kubenswrapper[4965]: I1125 15:08:24.842840 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5477954dc8-r9rft"] Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.447927 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" event={"ID":"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd","Type":"ContainerStarted","Data":"0bfba77ec6df7ade40c4be17f0116494e07fbdc797db279d8fd1317f75c5e24e"} Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.448746 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.448800 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" event={"ID":"c9c0f6f9-59d5-4b30-938e-22dc9b61a1cd","Type":"ContainerStarted","Data":"c58333034c2acfb05778ff0b8e46de2b1cb2f1ec49cd15b60afa363b20db3c6a"} Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 
15:08:25.456367 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.484753 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5477954dc8-r9rft" podStartSLOduration=27.484732635 podStartE2EDuration="27.484732635s" podCreationTimestamp="2025-11-25 15:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:08:25.470553139 +0000 UTC m=+250.438146885" watchObservedRunningTime="2025-11-25 15:08:25.484732635 +0000 UTC m=+250.452326381" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.486904 4965 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487063 4965 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487314 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487330 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487341 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487348 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487360 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487368 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487381 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487388 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487399 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487407 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487419 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487427 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487444 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487451 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 15:08:25 crc kubenswrapper[4965]: 
I1125 15:08:25.487530 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a" gracePeriod=15 Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487594 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487613 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487625 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487634 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487645 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487656 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487696 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a" gracePeriod=15 Nov 25 15:08:25 crc 
kubenswrapper[4965]: I1125 15:08:25.487751 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19" gracePeriod=15 Nov 25 15:08:25 crc kubenswrapper[4965]: E1125 15:08:25.487788 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487794 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c" gracePeriod=15 Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487800 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487859 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9" gracePeriod=15 Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.487999 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.491900 4965 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 
15:08:25.492804 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.505192 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.527408 4965 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.627701 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.627792 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.627844 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.627897 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.627920 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.627941 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.628104 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.628153 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729176 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729231 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729289 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729325 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729386 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729466 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729493 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729549 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729616 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729647 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729685 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729728 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729735 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729770 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:25 crc kubenswrapper[4965]: I1125 15:08:25.729800 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.454442 4965 generic.go:334] "Generic (PLEG): container finished" podID="a927b1ca-0619-4f2f-89bd-5583da792645" containerID="61ea41873abba05582e0d11a94923835c7bd41458625f5e7b926cb2b917f323d" exitCode=0 Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.454524 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a927b1ca-0619-4f2f-89bd-5583da792645","Type":"ContainerDied","Data":"61ea41873abba05582e0d11a94923835c7bd41458625f5e7b926cb2b917f323d"} Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.457071 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.458761 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.459738 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a" exitCode=0 Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.459776 4965 scope.go:117] "RemoveContainer" containerID="0a4124203958a89a3a6aeb90a1fd71a14264348ef4629c08408e9cd9b35159c3" Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.459847 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19" exitCode=0 Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.459869 4965 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c" exitCode=0 Nov 25 15:08:26 crc kubenswrapper[4965]: I1125 15:08:26.459884 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9" exitCode=2 Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.066829 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.067879 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.068361 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.068857 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.069274 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.069313 4965 controller.go:115] "failed to update lease using latest lease, 
fallback to ensure lease" err="failed 5 attempts to update lease" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.069677 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="200ms" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.271438 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="400ms" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.469136 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 15:08:27 crc kubenswrapper[4965]: E1125 15:08:27.677206 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="800ms" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.895369 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.952148 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.952857 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.965369 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-var-lock\") pod \"a927b1ca-0619-4f2f-89bd-5583da792645\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.965453 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a927b1ca-0619-4f2f-89bd-5583da792645-kube-api-access\") pod \"a927b1ca-0619-4f2f-89bd-5583da792645\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.965487 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-var-lock" (OuterVolumeSpecName: "var-lock") pod "a927b1ca-0619-4f2f-89bd-5583da792645" (UID: "a927b1ca-0619-4f2f-89bd-5583da792645"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.965530 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-kubelet-dir\") pod \"a927b1ca-0619-4f2f-89bd-5583da792645\" (UID: \"a927b1ca-0619-4f2f-89bd-5583da792645\") " Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.965726 4965 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-var-lock\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.965749 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a927b1ca-0619-4f2f-89bd-5583da792645" (UID: "a927b1ca-0619-4f2f-89bd-5583da792645"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:08:27 crc kubenswrapper[4965]: I1125 15:08:27.970523 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a927b1ca-0619-4f2f-89bd-5583da792645-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a927b1ca-0619-4f2f-89bd-5583da792645" (UID: "a927b1ca-0619-4f2f-89bd-5583da792645"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066433 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066536 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066561 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066570 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066602 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066653 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066873 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a927b1ca-0619-4f2f-89bd-5583da792645-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066893 4965 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066905 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a927b1ca-0619-4f2f-89bd-5583da792645-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066918 4965 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.066928 4965 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.477570 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="1.6s" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.480343 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.481158 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a" exitCode=0 Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.481245 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.481252 4965 scope.go:117] "RemoveContainer" containerID="1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.484448 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a927b1ca-0619-4f2f-89bd-5583da792645","Type":"ContainerDied","Data":"cd7a1e4fe723a5e8cdf73b2685663f4bbab6fb26c309107c15d37b0748e1da91"} Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.484490 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd7a1e4fe723a5e8cdf73b2685663f4bbab6fb26c309107c15d37b0748e1da91" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.484538 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.556514 4965 scope.go:117] "RemoveContainer" containerID="9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.576854 4965 scope.go:117] "RemoveContainer" containerID="25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.589999 4965 scope.go:117] "RemoveContainer" containerID="d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.609307 4965 scope.go:117] "RemoveContainer" containerID="29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.621865 4965 scope.go:117] "RemoveContainer" containerID="869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.645528 4965 scope.go:117] "RemoveContainer" containerID="1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.645898 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\": container with ID starting with 1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a not found: ID does not exist" containerID="1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.645957 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a"} err="failed to get container status \"1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\": rpc error: code = NotFound desc = could not find 
container \"1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a\": container with ID starting with 1a64f22a96231b758ca49bd8887bc6268eecd480b5e5e02a6a4cfda68615b31a not found: ID does not exist" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.646015 4965 scope.go:117] "RemoveContainer" containerID="9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.646309 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\": container with ID starting with 9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19 not found: ID does not exist" containerID="9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.646334 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19"} err="failed to get container status \"9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\": rpc error: code = NotFound desc = could not find container \"9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19\": container with ID starting with 9cdb83c5586505cd4e6d926c36404ed23fad22bc1e268c62c0cf4e44a47b8b19 not found: ID does not exist" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.646356 4965 scope.go:117] "RemoveContainer" containerID="25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.646954 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\": container with ID starting with 25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c not found: ID does 
not exist" containerID="25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.646993 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c"} err="failed to get container status \"25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\": rpc error: code = NotFound desc = could not find container \"25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c\": container with ID starting with 25ec6e4f7b649e3ed86ad1e62e2101ded82feaa24e2bed128ecccf2033782e1c not found: ID does not exist" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.647007 4965 scope.go:117] "RemoveContainer" containerID="d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.647256 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\": container with ID starting with d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9 not found: ID does not exist" containerID="d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.647273 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9"} err="failed to get container status \"d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\": rpc error: code = NotFound desc = could not find container \"d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9\": container with ID starting with d6636458941dceed6de40be3fc3cf6ff14e3a2b767e331c29f3b43e9bbb32bf9 not found: ID does not exist" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.647285 4965 
scope.go:117] "RemoveContainer" containerID="29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.647529 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\": container with ID starting with 29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a not found: ID does not exist" containerID="29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.647544 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a"} err="failed to get container status \"29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\": rpc error: code = NotFound desc = could not find container \"29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a\": container with ID starting with 29ad4b497169d89c7ef55c9b1f75a3b211dc2d9a8fb6f4d1e68355182250b56a not found: ID does not exist" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.647554 4965 scope.go:117] "RemoveContainer" containerID="869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8" Nov 25 15:08:28 crc kubenswrapper[4965]: E1125 15:08:28.647822 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\": container with ID starting with 869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8 not found: ID does not exist" containerID="869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.647853 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8"} err="failed to get container status \"869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\": rpc error: code = NotFound desc = could not find container \"869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8\": container with ID starting with 869e7e96dbea05689addfc3af6c4201c65d2617e4154f396155f3a607dd3c2a8 not found: ID does not exist" Nov 25 15:08:28 crc kubenswrapper[4965]: I1125 15:08:28.791139 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 25 15:08:30 crc kubenswrapper[4965]: E1125 15:08:30.079403 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="3.2s" Nov 25 15:08:30 crc kubenswrapper[4965]: I1125 15:08:30.529831 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:30 crc kubenswrapper[4965]: I1125 15:08:30.536091 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:30 crc kubenswrapper[4965]: I1125 15:08:30.536884 4965 status_manager.go:851] "Failed to get status for pod" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:30 crc kubenswrapper[4965]: E1125 15:08:30.537458 4965 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:30 crc kubenswrapper[4965]: I1125 15:08:30.537516 4965 status_manager.go:851] "Failed to get status for pod" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:30 crc kubenswrapper[4965]: I1125 15:08:30.538129 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:30 crc kubenswrapper[4965]: I1125 15:08:30.538158 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:30 crc kubenswrapper[4965]: E1125 15:08:30.578522 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b486ca92c0554 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 15:08:30.577583444 +0000 UTC m=+255.545177190,LastTimestamp:2025-11-25 15:08:30.577583444 +0000 UTC m=+255.545177190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 15:08:31 crc kubenswrapper[4965]: I1125 15:08:31.508655 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4"} Nov 25 15:08:31 crc kubenswrapper[4965]: I1125 15:08:31.509007 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"10bf3ba125ae0fdd225c4750a04be06cd50755f38f3467cf9f110edc290dfda4"} Nov 25 15:08:31 crc 
kubenswrapper[4965]: I1125 15:08:31.509538 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:31 crc kubenswrapper[4965]: E1125 15:08:31.509611 4965 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 15:08:31 crc kubenswrapper[4965]: I1125 15:08:31.509758 4965 status_manager.go:851] "Failed to get status for pod" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:33 crc kubenswrapper[4965]: E1125 15:08:33.280734 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="6.4s" Nov 25 15:08:33 crc kubenswrapper[4965]: E1125 15:08:33.804949 4965 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" volumeName="registry-storage" Nov 25 15:08:34 crc 
kubenswrapper[4965]: E1125 15:08:34.603418 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b486ca92c0554 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 15:08:30.577583444 +0000 UTC m=+255.545177190,LastTimestamp:2025-11-25 15:08:30.577583444 +0000 UTC m=+255.545177190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 15:08:36 crc kubenswrapper[4965]: I1125 15:08:36.776898 4965 status_manager.go:851] "Failed to get status for pod" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:36 crc kubenswrapper[4965]: I1125 15:08:36.777936 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:38 crc 
kubenswrapper[4965]: I1125 15:08:38.770864 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:38 crc kubenswrapper[4965]: I1125 15:08:38.772107 4965 status_manager.go:851] "Failed to get status for pod" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:38 crc kubenswrapper[4965]: I1125 15:08:38.772512 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:38 crc kubenswrapper[4965]: I1125 15:08:38.784064 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:38 crc kubenswrapper[4965]: I1125 15:08:38.784314 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:38 crc kubenswrapper[4965]: E1125 15:08:38.784999 4965 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:38 crc kubenswrapper[4965]: I1125 15:08:38.785791 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:38 crc kubenswrapper[4965]: W1125 15:08:38.803890 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-6747b1cdca0f6ba9d39e7599b5828737b7eac8e8d82546c0f110a180f4a0455e WatchSource:0}: Error finding container 6747b1cdca0f6ba9d39e7599b5828737b7eac8e8d82546c0f110a180f4a0455e: Status 404 returned error can't find the container with id 6747b1cdca0f6ba9d39e7599b5828737b7eac8e8d82546c0f110a180f4a0455e Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.567559 4965 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a2ff41cee3a2ff7c47f6322a6a6daac4a467b5fc5b27a5cfd69046debb653c6b" exitCode=0 Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.567657 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a2ff41cee3a2ff7c47f6322a6a6daac4a467b5fc5b27a5cfd69046debb653c6b"} Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.568048 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6747b1cdca0f6ba9d39e7599b5828737b7eac8e8d82546c0f110a180f4a0455e"} Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.568474 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.568508 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.569169 4965 status_manager.go:851] 
"Failed to get status for pod" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:39 crc kubenswrapper[4965]: E1125 15:08:39.569294 4965 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:39 crc kubenswrapper[4965]: I1125 15:08:39.569730 4965 status_manager.go:851] "Failed to get status for pod" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" pod="openshift-marketplace/redhat-marketplace-g9kt4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g9kt4\": dial tcp 38.102.83.176:6443: connect: connection refused" Nov 25 15:08:39 crc kubenswrapper[4965]: E1125 15:08:39.682354 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="7s" Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.578861 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.579202 4965 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53" exitCode=1 Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.579264 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53"} Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.579691 4965 scope.go:117] "RemoveContainer" containerID="98cb48f486da797da99f2abb6cbadcf259dd750e380ddc047ca53738aaf07e53" Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.583435 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e493d5c17e8801df1c6478e718c3ce2dd090388bea7bf36ee889f263639cb11c"} Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.583671 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"59f466fda81d89caf6a8950dafd82a7e280d6f577d8e3c34c57636a73a8e96d7"} Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.583681 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb5dbf73ab32e56722d0090bbc052627f1633bf9f578b7db21b0aa03dbf36925"} Nov 25 15:08:40 crc kubenswrapper[4965]: I1125 15:08:40.583689 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e43d5640767b09bc54be7b8cd75d27060aed779e3a9bbaedaa7d4e5e8db9c85c"} Nov 25 15:08:41 crc kubenswrapper[4965]: I1125 15:08:41.591367 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 15:08:41 crc kubenswrapper[4965]: I1125 15:08:41.591458 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c827b30d8f1762a761738297d804e6ecf1239b0c8a23e651f8ff126ff349fa4"} Nov 25 15:08:41 crc kubenswrapper[4965]: I1125 15:08:41.594139 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a0949c16e634854bf13de616758ff230a3e34fdfb717359f2835976b6621b7a1"} Nov 25 15:08:41 crc kubenswrapper[4965]: I1125 15:08:41.594298 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:41 crc kubenswrapper[4965]: I1125 15:08:41.594357 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:41 crc kubenswrapper[4965]: I1125 15:08:41.594384 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:43 crc kubenswrapper[4965]: I1125 15:08:43.787362 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:43 crc kubenswrapper[4965]: I1125 15:08:43.787760 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:43 crc kubenswrapper[4965]: I1125 15:08:43.793451 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:44 crc kubenswrapper[4965]: I1125 15:08:44.117797 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:08:44 crc kubenswrapper[4965]: I1125 15:08:44.124400 4965 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:08:44 crc kubenswrapper[4965]: I1125 15:08:44.610938 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:08:46 crc kubenswrapper[4965]: I1125 15:08:46.604463 4965 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:46 crc kubenswrapper[4965]: I1125 15:08:46.620816 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:46 crc kubenswrapper[4965]: I1125 15:08:46.620848 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:46 crc kubenswrapper[4965]: I1125 15:08:46.624798 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:08:46 crc kubenswrapper[4965]: I1125 15:08:46.788585 4965 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="89fe6f51-a7f3-4ccd-8e4d-563795e03938" Nov 25 15:08:47 crc kubenswrapper[4965]: I1125 15:08:47.625548 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:47 crc kubenswrapper[4965]: I1125 15:08:47.625573 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a97f603b-f66c-42c7-9916-56c991135ede" Nov 25 15:08:47 crc kubenswrapper[4965]: I1125 15:08:47.628228 4965 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="89fe6f51-a7f3-4ccd-8e4d-563795e03938" Nov 25 15:08:56 crc kubenswrapper[4965]: I1125 15:08:56.209150 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 15:08:56 crc kubenswrapper[4965]: I1125 15:08:56.416664 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 25 15:08:56 crc kubenswrapper[4965]: I1125 15:08:56.846960 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.002129 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.247799 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.579390 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.711499 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.754509 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.778379 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 15:08:57 crc kubenswrapper[4965]: I1125 15:08:57.960300 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 
25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.072827 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.084377 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.092725 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.114605 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.153818 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.194690 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.200543 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.245482 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.303226 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.376803 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.765359 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.895409 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.897769 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.902612 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 25 15:08:58 crc kubenswrapper[4965]: I1125 15:08:58.979479 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.130433 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.221221 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.549458 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.591813 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.617497 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.907485 4965 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.919553 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.952412 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.956909 4965 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 25 15:08:59 crc kubenswrapper[4965]: I1125 15:08:59.995680 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.005165 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.024683 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.053593 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.082391 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.085280 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.088718 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 15:09:00 crc kubenswrapper[4965]: 
I1125 15:09:00.161639 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.215539 4965 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.221388 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.221475 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.226139 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.241370 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.241354102 podStartE2EDuration="14.241354102s" podCreationTimestamp="2025-11-25 15:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:00.240185721 +0000 UTC m=+285.207779477" watchObservedRunningTime="2025-11-25 15:09:00.241354102 +0000 UTC m=+285.208947858" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.244076 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.410792 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.473459 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 15:09:00 
crc kubenswrapper[4965]: I1125 15:09:00.482385 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.584098 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.593667 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.637630 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.697398 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.764630 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.865566 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 15:09:00 crc kubenswrapper[4965]: I1125 15:09:00.940598 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.007034 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.011061 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.060854 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.144779 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.186068 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.234020 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.245322 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.286016 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.313471 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.317459 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.440385 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.571451 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.625771 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.700674 4965 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.878649 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 15:09:01 crc kubenswrapper[4965]: I1125 15:09:01.891261 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.022423 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.172206 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.214839 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.228571 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.238435 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.251444 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.359334 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.373152 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.373171 4965 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.373226 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.373908 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.457308 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.520345 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.662270 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.735564 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.767914 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 15:09:02 crc kubenswrapper[4965]: I1125 15:09:02.840315 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.042179 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.062138 4965 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.158838 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.271475 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.293155 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.331561 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.368627 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.417685 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.458282 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.661895 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.756258 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.803522 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.844549 4965 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.856590 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.872752 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.889374 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.921129 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.934234 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.937793 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 15:09:03 crc kubenswrapper[4965]: I1125 15:09:03.961211 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.063139 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.067442 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.147214 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 15:09:04 crc kubenswrapper[4965]: 
I1125 15:09:04.170703 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.269033 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.272358 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.480568 4965 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.575989 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.721256 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.741449 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.844821 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.876773 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 25 15:09:04 crc kubenswrapper[4965]: I1125 15:09:04.956921 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.014198 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.034407 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.041427 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.080160 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.120375 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.124450 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.264535 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.294294 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.383427 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.399121 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.489436 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" 
Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.498512 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.528247 4965 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.643999 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.666521 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.701184 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.756736 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.813388 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.822451 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.855043 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 15:09:05 crc kubenswrapper[4965]: I1125 15:09:05.869978 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.046200 4965 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.052579 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.117184 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.133486 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.145653 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.196244 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.249841 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.255907 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.264782 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.294687 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.368154 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 25 15:09:06 crc kubenswrapper[4965]: 
I1125 15:09:06.402829 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.489686 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.570606 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.639550 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.682413 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.725889 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 25 15:09:06 crc kubenswrapper[4965]: I1125 15:09:06.859911 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.055267 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.221339 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.244420 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.280707 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.309664 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.321049 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.331718 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.541716 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.584634 4965 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.701148 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.816092 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.868805 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 15:09:07 crc kubenswrapper[4965]: I1125 15:09:07.910512 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.019484 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.032064 4965 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.083929 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.120043 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.120338 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.137348 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.171149 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.195763 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.246416 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.260352 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.285609 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.355676 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.367061 4965 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.381047 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.381529 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.425766 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.493648 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.536505 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.552403 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.609704 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.756303 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.764811 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.790451 4965 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.812812 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 15:09:08 crc kubenswrapper[4965]: I1125 15:09:08.829189 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.012566 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.150121 4965 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.150334 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4" gracePeriod=5 Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.233185 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.290643 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.314394 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.390205 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 15:09:09 crc 
kubenswrapper[4965]: I1125 15:09:09.452752 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.481582 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.607560 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.674673 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.679701 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.784881 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.793908 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.802787 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.807461 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.903153 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.944330 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 15:09:09 crc kubenswrapper[4965]: I1125 15:09:09.950806 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.097915 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.130873 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.183203 4965 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.312134 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.356150 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.408243 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.420407 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.496406 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.505732 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 
15:09:10.653724 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.689354 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.741237 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.820290 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.826125 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Nov 25 15:09:10 crc kubenswrapper[4965]: I1125 15:09:10.997863 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.009509 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.043368 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.070773 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.259325 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.293026 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.428308 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.547217 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.607604 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.649676 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.817378 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Nov 25 15:09:11 crc kubenswrapper[4965]: I1125 15:09:11.960910 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.022165 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.039511 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.161682 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.545031 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.593135 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.623070 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 25 15:09:12 crc kubenswrapper[4965]: I1125 15:09:12.930347 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Nov 25 15:09:13 crc kubenswrapper[4965]: I1125 15:09:13.136377 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 25 15:09:13 crc kubenswrapper[4965]: I1125 15:09:13.177706 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 25 15:09:13 crc kubenswrapper[4965]: I1125 15:09:13.505718 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 25 15:09:13 crc kubenswrapper[4965]: I1125 15:09:13.622451 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Nov 25 15:09:13 crc kubenswrapper[4965]: I1125 15:09:13.667613 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.695959 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.757635 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.757762 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.776977 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.777015 4965 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4" exitCode=137
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.777073 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.788287 4965 scope.go:117] "RemoveContainer" containerID="365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.800479 4965 scope.go:117] "RemoveContainer" containerID="365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4"
Nov 25 15:09:14 crc kubenswrapper[4965]: E1125 15:09:14.801069 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4\": container with ID starting with 365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4 not found: ID does not exist" containerID="365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.801187 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4"} err="failed to get container status \"365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4\": rpc error: code = NotFound desc = could not find container \"365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4\": container with ID starting with 365620749cffa3cac1b06640d28a2363a7280762f2a6cecb5e92095cd1d5f9b4 not found: ID does not exist"
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864525 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864604 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864651 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864673 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864675 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864756 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864807 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864864 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.864947 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.865300 4965 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.865326 4965 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.865342 4965 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.865359 4965 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.888305 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 15:09:14 crc kubenswrapper[4965]: I1125 15:09:14.966602 4965 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:16 crc kubenswrapper[4965]: I1125 15:09:16.783531 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.216495 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rgvb"]
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.217266 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" podUID="77477a76-df54-4755-89b0-9b2ec40e098d" containerName="controller-manager" containerID="cri-o://7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8" gracePeriod=30
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.306570 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"]
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.306788 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" podUID="78e98b3d-733f-4b7a-abcf-950d6870c04f" containerName="route-controller-manager" containerID="cri-o://55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608" gracePeriod=30
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.574302 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.707816 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731114 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgbdl\" (UniqueName: \"kubernetes.io/projected/77477a76-df54-4755-89b0-9b2ec40e098d-kube-api-access-lgbdl\") pod \"77477a76-df54-4755-89b0-9b2ec40e098d\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731164 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-config\") pod \"77477a76-df54-4755-89b0-9b2ec40e098d\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731193 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-client-ca\") pod \"77477a76-df54-4755-89b0-9b2ec40e098d\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731303 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-config\") pod \"78e98b3d-733f-4b7a-abcf-950d6870c04f\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731338 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e98b3d-733f-4b7a-abcf-950d6870c04f-serving-cert\") pod \"78e98b3d-733f-4b7a-abcf-950d6870c04f\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731369 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77477a76-df54-4755-89b0-9b2ec40e098d-serving-cert\") pod \"77477a76-df54-4755-89b0-9b2ec40e098d\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.731391 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-proxy-ca-bundles\") pod \"77477a76-df54-4755-89b0-9b2ec40e098d\" (UID: \"77477a76-df54-4755-89b0-9b2ec40e098d\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.732060 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "77477a76-df54-4755-89b0-9b2ec40e098d" (UID: "77477a76-df54-4755-89b0-9b2ec40e098d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.732346 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-config" (OuterVolumeSpecName: "config") pod "78e98b3d-733f-4b7a-abcf-950d6870c04f" (UID: "78e98b3d-733f-4b7a-abcf-950d6870c04f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.732360 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-config" (OuterVolumeSpecName: "config") pod "77477a76-df54-4755-89b0-9b2ec40e098d" (UID: "77477a76-df54-4755-89b0-9b2ec40e098d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.732833 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-client-ca" (OuterVolumeSpecName: "client-ca") pod "77477a76-df54-4755-89b0-9b2ec40e098d" (UID: "77477a76-df54-4755-89b0-9b2ec40e098d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.738116 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77477a76-df54-4755-89b0-9b2ec40e098d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77477a76-df54-4755-89b0-9b2ec40e098d" (UID: "77477a76-df54-4755-89b0-9b2ec40e098d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.738214 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77477a76-df54-4755-89b0-9b2ec40e098d-kube-api-access-lgbdl" (OuterVolumeSpecName: "kube-api-access-lgbdl") pod "77477a76-df54-4755-89b0-9b2ec40e098d" (UID: "77477a76-df54-4755-89b0-9b2ec40e098d"). InnerVolumeSpecName "kube-api-access-lgbdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.738272 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78e98b3d-733f-4b7a-abcf-950d6870c04f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78e98b3d-733f-4b7a-abcf-950d6870c04f" (UID: "78e98b3d-733f-4b7a-abcf-950d6870c04f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.832535 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-client-ca\") pod \"78e98b3d-733f-4b7a-abcf-950d6870c04f\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.832959 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km5sj\" (UniqueName: \"kubernetes.io/projected/78e98b3d-733f-4b7a-abcf-950d6870c04f-kube-api-access-km5sj\") pod \"78e98b3d-733f-4b7a-abcf-950d6870c04f\" (UID: \"78e98b3d-733f-4b7a-abcf-950d6870c04f\") "
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.833480 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgbdl\" (UniqueName: \"kubernetes.io/projected/77477a76-df54-4755-89b0-9b2ec40e098d-kube-api-access-lgbdl\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.833734 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-config\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.833758 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-client-ca\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.833772 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-config\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.833990 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e98b3d-733f-4b7a-abcf-950d6870c04f-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.834006 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77477a76-df54-4755-89b0-9b2ec40e098d-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.834018 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77477a76-df54-4755-89b0-9b2ec40e098d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.834096 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-client-ca" (OuterVolumeSpecName: "client-ca") pod "78e98b3d-733f-4b7a-abcf-950d6870c04f" (UID: "78e98b3d-733f-4b7a-abcf-950d6870c04f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.837221 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e98b3d-733f-4b7a-abcf-950d6870c04f-kube-api-access-km5sj" (OuterVolumeSpecName: "kube-api-access-km5sj") pod "78e98b3d-733f-4b7a-abcf-950d6870c04f" (UID: "78e98b3d-733f-4b7a-abcf-950d6870c04f"). InnerVolumeSpecName "kube-api-access-km5sj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.876365 4965 generic.go:334] "Generic (PLEG): container finished" podID="77477a76-df54-4755-89b0-9b2ec40e098d" containerID="7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8" exitCode=0
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.876410 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.876454 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" event={"ID":"77477a76-df54-4755-89b0-9b2ec40e098d","Type":"ContainerDied","Data":"7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8"}
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.876501 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rgvb" event={"ID":"77477a76-df54-4755-89b0-9b2ec40e098d","Type":"ContainerDied","Data":"8b54bc2783c91d73c9fbd0507c65f19d9f00cb999e1d93326ebbd4a7313b6c3b"}
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.876522 4965 scope.go:117] "RemoveContainer" containerID="7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.880018 4965 generic.go:334] "Generic (PLEG): container finished" podID="78e98b3d-733f-4b7a-abcf-950d6870c04f" containerID="55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608" exitCode=0
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.880111 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" event={"ID":"78e98b3d-733f-4b7a-abcf-950d6870c04f","Type":"ContainerDied","Data":"55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608"}
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.880034 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.880148 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52" event={"ID":"78e98b3d-733f-4b7a-abcf-950d6870c04f","Type":"ContainerDied","Data":"9179ed7262685d2e4053cf19623860c7f497c9d073c31f73b4bf7fec48e88d99"}
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.896151 4965 scope.go:117] "RemoveContainer" containerID="7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8"
Nov 25 15:09:32 crc kubenswrapper[4965]: E1125 15:09:32.897899 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8\": container with ID starting with 7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8 not found: ID does not exist" containerID="7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.897940 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8"} err="failed to get container status \"7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8\": rpc error: code = NotFound desc = could not find container \"7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8\": container with ID starting with 7d464f8fae1bfaa310425aa40feb5a7e9b08de6224d916e09f48ac61df99f7c8 not found: ID does not exist"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.897980 4965 scope.go:117] "RemoveContainer" containerID="55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.901770 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rgvb"]
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.909108 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rgvb"]
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.911905 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"]
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.915922 4965 scope.go:117] "RemoveContainer" containerID="55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608"
Nov 25 15:09:32 crc kubenswrapper[4965]: E1125 15:09:32.916350 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608\": container with ID starting with 55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608 not found: ID does not exist" containerID="55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.916421 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608"} err="failed to get container status \"55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608\": rpc error: code = NotFound desc = could not find container \"55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608\": container with ID starting with 55126c94cc46dcc149d80abbd674726d5d0b72470e28100c952360ccb3679608 not found: ID does not exist"
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.916569 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ndw52"]
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.934615 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km5sj\" (UniqueName: \"kubernetes.io/projected/78e98b3d-733f-4b7a-abcf-950d6870c04f-kube-api-access-km5sj\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:32 crc kubenswrapper[4965]: I1125 15:09:32.934640 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e98b3d-733f-4b7a-abcf-950d6870c04f-client-ca\") on node \"crc\" DevicePath \"\""
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.353878 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"]
Nov 25 15:09:33 crc kubenswrapper[4965]: E1125 15:09:33.354311 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" containerName="installer"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354341 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" containerName="installer"
Nov 25 15:09:33 crc kubenswrapper[4965]: E1125 15:09:33.354370 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354429 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 25 15:09:33 crc kubenswrapper[4965]: E1125 15:09:33.354469 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77477a76-df54-4755-89b0-9b2ec40e098d" containerName="controller-manager"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354487 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="77477a76-df54-4755-89b0-9b2ec40e098d" containerName="controller-manager"
Nov 25 15:09:33 crc kubenswrapper[4965]: E1125 15:09:33.354524 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e98b3d-733f-4b7a-abcf-950d6870c04f" containerName="route-controller-manager"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354541 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e98b3d-733f-4b7a-abcf-950d6870c04f" containerName="route-controller-manager"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354760 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e98b3d-733f-4b7a-abcf-950d6870c04f" containerName="route-controller-manager"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354791 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354816 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="a927b1ca-0619-4f2f-89bd-5583da792645" containerName="installer"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.354849 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="77477a76-df54-4755-89b0-9b2ec40e098d" containerName="controller-manager"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.355759 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.357619 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"]
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.358293 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.359146 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.359372 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.359391 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.359956 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.361848 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.362108 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.362134 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.362305 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.363137 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.363450 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.363658 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.363692 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.377438 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.379059 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"]
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.386021 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"]
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.542985 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqbx\" (UniqueName: \"kubernetes.io/projected/87b1e5c4-d443-433e-8cea-91823fa16730-kube-api-access-9vqbx\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543038 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h4pj\" (UniqueName: \"kubernetes.io/projected/f7752370-de26-4d5f-9c51-edd1d3ff6328-kube-api-access-6h4pj\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543081 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-client-ca\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543103 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7752370-de26-4d5f-9c51-edd1d3ff6328-serving-cert\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543121 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-config\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543141 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-proxy-ca-bundles\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"
Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543156 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-config\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: 
\"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543172 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-client-ca\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.543224 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87b1e5c4-d443-433e-8cea-91823fa16730-serving-cert\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645151 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-client-ca\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645204 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7752370-de26-4d5f-9c51-edd1d3ff6328-serving-cert\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645231 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-config\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645257 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-proxy-ca-bundles\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645278 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-config\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645296 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-client-ca\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645324 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87b1e5c4-d443-433e-8cea-91823fa16730-serving-cert\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645364 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqbx\" (UniqueName: \"kubernetes.io/projected/87b1e5c4-d443-433e-8cea-91823fa16730-kube-api-access-9vqbx\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.645402 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h4pj\" (UniqueName: \"kubernetes.io/projected/f7752370-de26-4d5f-9c51-edd1d3ff6328-kube-api-access-6h4pj\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.646654 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-client-ca\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.648384 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-config\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.649365 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-config\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " 
pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.650484 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-proxy-ca-bundles\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.651541 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-client-ca\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.663821 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7752370-de26-4d5f-9c51-edd1d3ff6328-serving-cert\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.663850 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87b1e5c4-d443-433e-8cea-91823fa16730-serving-cert\") pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.673345 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqbx\" (UniqueName: \"kubernetes.io/projected/87b1e5c4-d443-433e-8cea-91823fa16730-kube-api-access-9vqbx\") 
pod \"controller-manager-64f4b76bd8-8cxq9\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.697683 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h4pj\" (UniqueName: \"kubernetes.io/projected/f7752370-de26-4d5f-9c51-edd1d3ff6328-kube-api-access-6h4pj\") pod \"route-controller-manager-c59b74f88-8mgpc\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.706232 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:33 crc kubenswrapper[4965]: I1125 15:09:33.713239 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.132417 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"] Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.165756 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"] Nov 25 15:09:34 crc kubenswrapper[4965]: W1125 15:09:34.171534 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7752370_de26_4d5f_9c51_edd1d3ff6328.slice/crio-2b2b1f1425508aa781a04050e309d4dafcf966eb726f64afe262b5ce534c0ba1 WatchSource:0}: Error finding container 2b2b1f1425508aa781a04050e309d4dafcf966eb726f64afe262b5ce534c0ba1: Status 404 returned error can't find the container with id 2b2b1f1425508aa781a04050e309d4dafcf966eb726f64afe262b5ce534c0ba1 Nov 25 15:09:34 crc 
kubenswrapper[4965]: I1125 15:09:34.349521 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"] Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.378574 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"] Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.779869 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77477a76-df54-4755-89b0-9b2ec40e098d" path="/var/lib/kubelet/pods/77477a76-df54-4755-89b0-9b2ec40e098d/volumes" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.780401 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e98b3d-733f-4b7a-abcf-950d6870c04f" path="/var/lib/kubelet/pods/78e98b3d-733f-4b7a-abcf-950d6870c04f/volumes" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.892960 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" event={"ID":"87b1e5c4-d443-433e-8cea-91823fa16730","Type":"ContainerStarted","Data":"870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020"} Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.893060 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" event={"ID":"87b1e5c4-d443-433e-8cea-91823fa16730","Type":"ContainerStarted","Data":"4d179843ec86369a28f3efa682c30445440fe1a9d14699daf0dde70f79455449"} Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.894137 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.895849 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" 
event={"ID":"f7752370-de26-4d5f-9c51-edd1d3ff6328","Type":"ContainerStarted","Data":"756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282"} Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.895886 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" event={"ID":"f7752370-de26-4d5f-9c51-edd1d3ff6328","Type":"ContainerStarted","Data":"2b2b1f1425508aa781a04050e309d4dafcf966eb726f64afe262b5ce534c0ba1"} Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.896485 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.899914 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.901302 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.937292 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" podStartSLOduration=2.937272546 podStartE2EDuration="2.937272546s" podCreationTimestamp="2025-11-25 15:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:34.91658891 +0000 UTC m=+319.884182656" watchObservedRunningTime="2025-11-25 15:09:34.937272546 +0000 UTC m=+319.904866292" Nov 25 15:09:34 crc kubenswrapper[4965]: I1125 15:09:34.986992 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" podStartSLOduration=2.986961334 
podStartE2EDuration="2.986961334s" podCreationTimestamp="2025-11-25 15:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:34.984695431 +0000 UTC m=+319.952289177" watchObservedRunningTime="2025-11-25 15:09:34.986961334 +0000 UTC m=+319.954555080" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.002744 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvgv6"] Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.003045 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fvgv6" podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="registry-server" containerID="cri-o://2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d" gracePeriod=2 Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.203587 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zt5k2"] Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.210335 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zt5k2" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="registry-server" containerID="cri-o://e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27" gracePeriod=2 Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.321350 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.388349 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-catalog-content\") pod \"d950b336-b79c-4b02-a695-66f4757027ca\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.388456 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-utilities\") pod \"d950b336-b79c-4b02-a695-66f4757027ca\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.388539 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnvk6\" (UniqueName: \"kubernetes.io/projected/d950b336-b79c-4b02-a695-66f4757027ca-kube-api-access-bnvk6\") pod \"d950b336-b79c-4b02-a695-66f4757027ca\" (UID: \"d950b336-b79c-4b02-a695-66f4757027ca\") " Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.389524 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-utilities" (OuterVolumeSpecName: "utilities") pod "d950b336-b79c-4b02-a695-66f4757027ca" (UID: "d950b336-b79c-4b02-a695-66f4757027ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.409720 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d950b336-b79c-4b02-a695-66f4757027ca-kube-api-access-bnvk6" (OuterVolumeSpecName: "kube-api-access-bnvk6") pod "d950b336-b79c-4b02-a695-66f4757027ca" (UID: "d950b336-b79c-4b02-a695-66f4757027ca"). InnerVolumeSpecName "kube-api-access-bnvk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.454523 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d950b336-b79c-4b02-a695-66f4757027ca" (UID: "d950b336-b79c-4b02-a695-66f4757027ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.489650 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.489684 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d950b336-b79c-4b02-a695-66f4757027ca-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.489698 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnvk6\" (UniqueName: \"kubernetes.io/projected/d950b336-b79c-4b02-a695-66f4757027ca-kube-api-access-bnvk6\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.565388 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.590874 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh2g4\" (UniqueName: \"kubernetes.io/projected/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-kube-api-access-qh2g4\") pod \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.590935 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-catalog-content\") pod \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.591368 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-utilities\") pod \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\" (UID: \"0316a5a7-25ae-44be-ab7e-f3499e04aa8e\") " Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.592531 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-utilities" (OuterVolumeSpecName: "utilities") pod "0316a5a7-25ae-44be-ab7e-f3499e04aa8e" (UID: "0316a5a7-25ae-44be-ab7e-f3499e04aa8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.602406 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-kube-api-access-qh2g4" (OuterVolumeSpecName: "kube-api-access-qh2g4") pod "0316a5a7-25ae-44be-ab7e-f3499e04aa8e" (UID: "0316a5a7-25ae-44be-ab7e-f3499e04aa8e"). InnerVolumeSpecName "kube-api-access-qh2g4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.651670 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0316a5a7-25ae-44be-ab7e-f3499e04aa8e" (UID: "0316a5a7-25ae-44be-ab7e-f3499e04aa8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.693246 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.693289 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh2g4\" (UniqueName: \"kubernetes.io/projected/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-kube-api-access-qh2g4\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.693301 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0316a5a7-25ae-44be-ab7e-f3499e04aa8e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.904870 4965 generic.go:334] "Generic (PLEG): container finished" podID="d950b336-b79c-4b02-a695-66f4757027ca" containerID="2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d" exitCode=0 Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.904907 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerDied","Data":"2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d"} Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.904987 4965 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvgv6" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.905008 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvgv6" event={"ID":"d950b336-b79c-4b02-a695-66f4757027ca","Type":"ContainerDied","Data":"7400c79cc0b4908206074b8336dd7ff9980fc5614b36e0ac13a3c57f26c78abd"} Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.905040 4965 scope.go:117] "RemoveContainer" containerID="2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.908133 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt5k2" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.908190 4965 generic.go:334] "Generic (PLEG): container finished" podID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerID="e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27" exitCode=0 Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.908195 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt5k2" event={"ID":"0316a5a7-25ae-44be-ab7e-f3499e04aa8e","Type":"ContainerDied","Data":"e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27"} Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.908252 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt5k2" event={"ID":"0316a5a7-25ae-44be-ab7e-f3499e04aa8e","Type":"ContainerDied","Data":"b7da68ca6d241602861c4f0d6370bfb7cb940970a9c96f5542fc0d733f9bdc01"} Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.908673 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" podUID="f7752370-de26-4d5f-9c51-edd1d3ff6328" containerName="route-controller-manager" 
containerID="cri-o://756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282" gracePeriod=30 Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.908565 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" podUID="87b1e5c4-d443-433e-8cea-91823fa16730" containerName="controller-manager" containerID="cri-o://870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020" gracePeriod=30 Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.934239 4965 scope.go:117] "RemoveContainer" containerID="63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.966566 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvgv6"] Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.971060 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fvgv6"] Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.979067 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zt5k2"] Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.981360 4965 scope.go:117] "RemoveContainer" containerID="9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1" Nov 25 15:09:35 crc kubenswrapper[4965]: I1125 15:09:35.984332 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zt5k2"] Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.001245 4965 scope.go:117] "RemoveContainer" containerID="2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.001793 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d\": container with ID 
starting with 2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d not found: ID does not exist" containerID="2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.001844 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d"} err="failed to get container status \"2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d\": rpc error: code = NotFound desc = could not find container \"2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d\": container with ID starting with 2627e60833ce7a30670004aded049fe140cca6efcf2aa20bb79f3de074dc571d not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.001881 4965 scope.go:117] "RemoveContainer" containerID="63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.002320 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc\": container with ID starting with 63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc not found: ID does not exist" containerID="63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.002351 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc"} err="failed to get container status \"63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc\": rpc error: code = NotFound desc = could not find container \"63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc\": container with ID starting with 63e37835853332cb0a6e04ca9f345389451afc708734259380a9c464d135befc not found: 
ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.002369 4965 scope.go:117] "RemoveContainer" containerID="9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.002632 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1\": container with ID starting with 9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1 not found: ID does not exist" containerID="9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.002669 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1"} err="failed to get container status \"9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1\": rpc error: code = NotFound desc = could not find container \"9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1\": container with ID starting with 9fd7305d39aa0462b8d7a6e759891c39d6807388a9a8652eb96e09036df248d1 not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.002697 4965 scope.go:117] "RemoveContainer" containerID="e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.109069 4965 scope.go:117] "RemoveContainer" containerID="3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.207598 4965 scope.go:117] "RemoveContainer" containerID="2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.229114 4965 scope.go:117] "RemoveContainer" containerID="e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27" Nov 25 15:09:36 crc 
kubenswrapper[4965]: E1125 15:09:36.234328 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27\": container with ID starting with e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27 not found: ID does not exist" containerID="e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.234365 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27"} err="failed to get container status \"e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27\": rpc error: code = NotFound desc = could not find container \"e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27\": container with ID starting with e5d0cd06ec4cf30a1046eb7fe171cee5e24a9de9915d2ac3d3d15c3d39af8d27 not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.234390 4965 scope.go:117] "RemoveContainer" containerID="3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.236220 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b\": container with ID starting with 3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b not found: ID does not exist" containerID="3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.236245 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b"} err="failed to get container status 
\"3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b\": rpc error: code = NotFound desc = could not find container \"3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b\": container with ID starting with 3ae52e2ff0b58e053f54d3791aa2b2361c62eee3400034b2ccda9dc6f8a7e83b not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.236261 4965 scope.go:117] "RemoveContainer" containerID="2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.240292 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7\": container with ID starting with 2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7 not found: ID does not exist" containerID="2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.240327 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7"} err="failed to get container status \"2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7\": rpc error: code = NotFound desc = could not find container \"2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7\": container with ID starting with 2a79ebbf9d04ad7adc7569515866ecc6a0b03168cc426a8349486081ed6636d7 not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.457677 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.461157 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.506154 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vqbx\" (UniqueName: \"kubernetes.io/projected/87b1e5c4-d443-433e-8cea-91823fa16730-kube-api-access-9vqbx\") pod \"87b1e5c4-d443-433e-8cea-91823fa16730\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.506224 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-config\") pod \"87b1e5c4-d443-433e-8cea-91823fa16730\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.506312 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-proxy-ca-bundles\") pod \"87b1e5c4-d443-433e-8cea-91823fa16730\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.506897 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-config" (OuterVolumeSpecName: "config") pod "87b1e5c4-d443-433e-8cea-91823fa16730" (UID: "87b1e5c4-d443-433e-8cea-91823fa16730"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507016 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "87b1e5c4-d443-433e-8cea-91823fa16730" (UID: "87b1e5c4-d443-433e-8cea-91823fa16730"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.506345 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-client-ca\") pod \"f7752370-de26-4d5f-9c51-edd1d3ff6328\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507071 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-client-ca" (OuterVolumeSpecName: "client-ca") pod "f7752370-de26-4d5f-9c51-edd1d3ff6328" (UID: "f7752370-de26-4d5f-9c51-edd1d3ff6328"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507094 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7752370-de26-4d5f-9c51-edd1d3ff6328-serving-cert\") pod \"f7752370-de26-4d5f-9c51-edd1d3ff6328\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507623 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-config\") pod \"f7752370-de26-4d5f-9c51-edd1d3ff6328\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507689 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87b1e5c4-d443-433e-8cea-91823fa16730-serving-cert\") pod \"87b1e5c4-d443-433e-8cea-91823fa16730\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507730 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-client-ca\") pod \"87b1e5c4-d443-433e-8cea-91823fa16730\" (UID: \"87b1e5c4-d443-433e-8cea-91823fa16730\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.507786 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h4pj\" (UniqueName: \"kubernetes.io/projected/f7752370-de26-4d5f-9c51-edd1d3ff6328-kube-api-access-6h4pj\") pod \"f7752370-de26-4d5f-9c51-edd1d3ff6328\" (UID: \"f7752370-de26-4d5f-9c51-edd1d3ff6328\") " Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.508022 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-config" (OuterVolumeSpecName: "config") pod "f7752370-de26-4d5f-9c51-edd1d3ff6328" (UID: "f7752370-de26-4d5f-9c51-edd1d3ff6328"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.508148 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.508161 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.508172 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.508189 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7752370-de26-4d5f-9c51-edd1d3ff6328-config\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.508349 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-client-ca" (OuterVolumeSpecName: "client-ca") pod "87b1e5c4-d443-433e-8cea-91823fa16730" (UID: "87b1e5c4-d443-433e-8cea-91823fa16730"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.514120 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7752370-de26-4d5f-9c51-edd1d3ff6328-kube-api-access-6h4pj" (OuterVolumeSpecName: "kube-api-access-6h4pj") pod "f7752370-de26-4d5f-9c51-edd1d3ff6328" (UID: "f7752370-de26-4d5f-9c51-edd1d3ff6328"). InnerVolumeSpecName "kube-api-access-6h4pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.520113 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b1e5c4-d443-433e-8cea-91823fa16730-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87b1e5c4-d443-433e-8cea-91823fa16730" (UID: "87b1e5c4-d443-433e-8cea-91823fa16730"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.528230 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b1e5c4-d443-433e-8cea-91823fa16730-kube-api-access-9vqbx" (OuterVolumeSpecName: "kube-api-access-9vqbx") pod "87b1e5c4-d443-433e-8cea-91823fa16730" (UID: "87b1e5c4-d443-433e-8cea-91823fa16730"). InnerVolumeSpecName "kube-api-access-9vqbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.529241 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7752370-de26-4d5f-9c51-edd1d3ff6328-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f7752370-de26-4d5f-9c51-edd1d3ff6328" (UID: "f7752370-de26-4d5f-9c51-edd1d3ff6328"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.609244 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7752370-de26-4d5f-9c51-edd1d3ff6328-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.609281 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87b1e5c4-d443-433e-8cea-91823fa16730-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.609290 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87b1e5c4-d443-433e-8cea-91823fa16730-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.609299 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h4pj\" (UniqueName: \"kubernetes.io/projected/f7752370-de26-4d5f-9c51-edd1d3ff6328-kube-api-access-6h4pj\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.609309 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vqbx\" (UniqueName: \"kubernetes.io/projected/87b1e5c4-d443-433e-8cea-91823fa16730-kube-api-access-9vqbx\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.777636 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" path="/var/lib/kubelet/pods/0316a5a7-25ae-44be-ab7e-f3499e04aa8e/volumes" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.778646 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d950b336-b79c-4b02-a695-66f4757027ca" path="/var/lib/kubelet/pods/d950b336-b79c-4b02-a695-66f4757027ca/volumes" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.918583 4965 generic.go:334] "Generic (PLEG): container finished" podID="87b1e5c4-d443-433e-8cea-91823fa16730" containerID="870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020" exitCode=0 Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.918651 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" event={"ID":"87b1e5c4-d443-433e-8cea-91823fa16730","Type":"ContainerDied","Data":"870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020"} Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.918652 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.918683 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9" event={"ID":"87b1e5c4-d443-433e-8cea-91823fa16730","Type":"ContainerDied","Data":"4d179843ec86369a28f3efa682c30445440fe1a9d14699daf0dde70f79455449"} Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.918722 4965 scope.go:117] "RemoveContainer" containerID="870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.921350 4965 generic.go:334] "Generic (PLEG): container finished" podID="f7752370-de26-4d5f-9c51-edd1d3ff6328" containerID="756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282" exitCode=0 Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.921407 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" event={"ID":"f7752370-de26-4d5f-9c51-edd1d3ff6328","Type":"ContainerDied","Data":"756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282"} Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.921429 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" event={"ID":"f7752370-de26-4d5f-9c51-edd1d3ff6328","Type":"ContainerDied","Data":"2b2b1f1425508aa781a04050e309d4dafcf966eb726f64afe262b5ce534c0ba1"} Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.921487 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.943520 4965 scope.go:117] "RemoveContainer" containerID="870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.943989 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020\": container with ID starting with 870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020 not found: ID does not exist" containerID="870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.944029 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020"} err="failed to get container status \"870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020\": rpc error: code = NotFound desc = could not find container \"870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020\": container with ID starting with 870db0554da7ef9d168e938f8547480b2dce0f175d1be115b92acf26570d5020 not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.944053 4965 scope.go:117] "RemoveContainer" containerID="756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.946622 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"] Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.954938 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59b74f88-8mgpc"] Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.959821 4965 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"] Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.960902 4965 scope.go:117] "RemoveContainer" containerID="756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282" Nov 25 15:09:36 crc kubenswrapper[4965]: E1125 15:09:36.961345 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282\": container with ID starting with 756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282 not found: ID does not exist" containerID="756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.961390 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282"} err="failed to get container status \"756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282\": rpc error: code = NotFound desc = could not find container \"756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282\": container with ID starting with 756dc1c5c93ba3ea474e731244536ec03e8032e32f1802c702d5f8e370292282 not found: ID does not exist" Nov 25 15:09:36 crc kubenswrapper[4965]: I1125 15:09:36.964590 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64f4b76bd8-8cxq9"] Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.604991 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9kt4"] Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.605369 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g9kt4" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="registry-server" 
containerID="cri-o://2f1eeae762f660161d50f777adf757128f6476ed38191443e3367c46659183de" gracePeriod=2 Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.934808 4965 generic.go:334] "Generic (PLEG): container finished" podID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerID="2f1eeae762f660161d50f777adf757128f6476ed38191443e3367c46659183de" exitCode=0 Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.934891 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerDied","Data":"2f1eeae762f660161d50f777adf757128f6476ed38191443e3367c46659183de"} Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.934946 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9kt4" event={"ID":"be82ef6f-bd03-4c9f-a760-d836fccf52a7","Type":"ContainerDied","Data":"aee938dc4ed33beb99edd36f3fac89153eaa8eec16d34ab027c7251ddfae1e30"} Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.934997 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aee938dc4ed33beb99edd36f3fac89153eaa8eec16d34ab027c7251ddfae1e30" Nov 25 15:09:37 crc kubenswrapper[4965]: I1125 15:09:37.957713 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.026906 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhvtz\" (UniqueName: \"kubernetes.io/projected/be82ef6f-bd03-4c9f-a760-d836fccf52a7-kube-api-access-zhvtz\") pod \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.026999 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-utilities\") pod \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.027042 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-catalog-content\") pod \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\" (UID: \"be82ef6f-bd03-4c9f-a760-d836fccf52a7\") " Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.027747 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-utilities" (OuterVolumeSpecName: "utilities") pod "be82ef6f-bd03-4c9f-a760-d836fccf52a7" (UID: "be82ef6f-bd03-4c9f-a760-d836fccf52a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.031431 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be82ef6f-bd03-4c9f-a760-d836fccf52a7-kube-api-access-zhvtz" (OuterVolumeSpecName: "kube-api-access-zhvtz") pod "be82ef6f-bd03-4c9f-a760-d836fccf52a7" (UID: "be82ef6f-bd03-4c9f-a760-d836fccf52a7"). InnerVolumeSpecName "kube-api-access-zhvtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.045938 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be82ef6f-bd03-4c9f-a760-d836fccf52a7" (UID: "be82ef6f-bd03-4c9f-a760-d836fccf52a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.127991 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.128031 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhvtz\" (UniqueName: \"kubernetes.io/projected/be82ef6f-bd03-4c9f-a760-d836fccf52a7-kube-api-access-zhvtz\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.128043 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be82ef6f-bd03-4c9f-a760-d836fccf52a7-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.783384 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b1e5c4-d443-433e-8cea-91823fa16730" path="/var/lib/kubelet/pods/87b1e5c4-d443-433e-8cea-91823fa16730/volumes" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.786590 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7752370-de26-4d5f-9c51-edd1d3ff6328" path="/var/lib/kubelet/pods/f7752370-de26-4d5f-9c51-edd1d3ff6328/volumes" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.938672 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9kt4" Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.955636 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9kt4"] Nov 25 15:09:38 crc kubenswrapper[4965]: I1125 15:09:38.959870 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9kt4"] Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.362899 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"] Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363185 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363204 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363218 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="extract-utilities" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363230 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="extract-utilities" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363241 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="extract-content" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363249 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="extract-content" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363261 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d950b336-b79c-4b02-a695-66f4757027ca" 
containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363269 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363278 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7752370-de26-4d5f-9c51-edd1d3ff6328" containerName="route-controller-manager" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363288 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7752370-de26-4d5f-9c51-edd1d3ff6328" containerName="route-controller-manager" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363302 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363310 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363319 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="extract-content" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363326 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="extract-content" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363339 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="extract-utilities" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363347 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="extract-utilities" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363361 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="extract-utilities" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363369 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="extract-utilities" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363379 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b1e5c4-d443-433e-8cea-91823fa16730" containerName="controller-manager" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363386 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1e5c4-d443-433e-8cea-91823fa16730" containerName="controller-manager" Nov 25 15:09:39 crc kubenswrapper[4965]: E1125 15:09:39.363396 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="extract-content" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363403 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="extract-content" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363516 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7752370-de26-4d5f-9c51-edd1d3ff6328" containerName="route-controller-manager" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363531 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b1e5c4-d443-433e-8cea-91823fa16730" containerName="controller-manager" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363540 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="d950b336-b79c-4b02-a695-66f4757027ca" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363556 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.363566 4965 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0316a5a7-25ae-44be-ab7e-f3499e04aa8e" containerName="registry-server" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.364220 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.368216 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-kq4c6"] Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.368797 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.368945 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.369031 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.369507 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.369736 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.369905 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.372341 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.374787 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.375272 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.375445 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.375609 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.375861 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.377038 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"] Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.377181 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.390640 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.394743 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-kq4c6"] Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.442638 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-client-ca\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.442684 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54f414-c740-438f-9337-9ba0d65db07b-serving-cert\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.442707 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-client-ca\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.442730 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-proxy-ca-bundles\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.442841 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-config\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " 
pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.443015 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-config\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.443093 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwm5g\" (UniqueName: \"kubernetes.io/projected/0d54f414-c740-438f-9337-9ba0d65db07b-kube-api-access-qwm5g\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.443126 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bafdade-fd9c-4ada-9263-a30532d21c32-serving-cert\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.443161 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lr6\" (UniqueName: \"kubernetes.io/projected/5bafdade-fd9c-4ada-9263-a30532d21c32-kube-api-access-m2lr6\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.543939 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-client-ca\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544010 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54f414-c740-438f-9337-9ba0d65db07b-serving-cert\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544031 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-client-ca\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544050 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-proxy-ca-bundles\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544075 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-config\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " 
pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544094 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-config\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544127 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwm5g\" (UniqueName: \"kubernetes.io/projected/0d54f414-c740-438f-9337-9ba0d65db07b-kube-api-access-qwm5g\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544151 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bafdade-fd9c-4ada-9263-a30532d21c32-serving-cert\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.544173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lr6\" (UniqueName: \"kubernetes.io/projected/5bafdade-fd9c-4ada-9263-a30532d21c32-kube-api-access-m2lr6\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.545457 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-proxy-ca-bundles\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.545467 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-client-ca\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.545644 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-config\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.545656 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-client-ca\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.545726 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-config\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.559820 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bafdade-fd9c-4ada-9263-a30532d21c32-serving-cert\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.562876 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lr6\" (UniqueName: \"kubernetes.io/projected/5bafdade-fd9c-4ada-9263-a30532d21c32-kube-api-access-m2lr6\") pod \"route-controller-manager-657648d9fc-9lg2z\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.563726 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54f414-c740-438f-9337-9ba0d65db07b-serving-cert\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.565398 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwm5g\" (UniqueName: \"kubernetes.io/projected/0d54f414-c740-438f-9337-9ba0d65db07b-kube-api-access-qwm5g\") pod \"controller-manager-5f784d6689-kq4c6\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.687149 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.703503 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.898433 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-kq4c6"] Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.946753 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"] Nov 25 15:09:39 crc kubenswrapper[4965]: I1125 15:09:39.955466 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" event={"ID":"0d54f414-c740-438f-9337-9ba0d65db07b","Type":"ContainerStarted","Data":"a624759e32d306555de1d0c4487e3d77ee32ea457ce597714d8d158a269fe16c"} Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.778047 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be82ef6f-bd03-4c9f-a760-d836fccf52a7" path="/var/lib/kubelet/pods/be82ef6f-bd03-4c9f-a760-d836fccf52a7/volumes" Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.965676 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" event={"ID":"0d54f414-c740-438f-9337-9ba0d65db07b","Type":"ContainerStarted","Data":"0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d"} Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.967011 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.971223 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" event={"ID":"5bafdade-fd9c-4ada-9263-a30532d21c32","Type":"ContainerStarted","Data":"74f5f8c2fd49cb6b7b4180585a02ab93122e1dc20b02bb9811164153bacfe218"} Nov 25 15:09:40 crc 
kubenswrapper[4965]: I1125 15:09:40.971604 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" event={"ID":"5bafdade-fd9c-4ada-9263-a30532d21c32","Type":"ContainerStarted","Data":"9c0d15952094489101445da2cdf88700efb147b9c4d0e929cc0e705c4ffeac83"} Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.971733 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.973777 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:40 crc kubenswrapper[4965]: I1125 15:09:40.976376 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:09:41 crc kubenswrapper[4965]: I1125 15:09:41.001681 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" podStartSLOduration=7.001663482 podStartE2EDuration="7.001663482s" podCreationTimestamp="2025-11-25 15:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:40.983083834 +0000 UTC m=+325.950677580" watchObservedRunningTime="2025-11-25 15:09:41.001663482 +0000 UTC m=+325.969257228" Nov 25 15:09:41 crc kubenswrapper[4965]: I1125 15:09:41.002865 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" podStartSLOduration=7.002858524 podStartE2EDuration="7.002858524s" podCreationTimestamp="2025-11-25 15:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:40.999628136 +0000 UTC m=+325.967221882" watchObservedRunningTime="2025-11-25 15:09:41.002858524 +0000 UTC m=+325.970452270" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.146139 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6t6zq"] Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.147477 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.162867 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6t6zq"] Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.191234 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd082883-2bfc-4baf-be6e-531ea0b41986-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.191338 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd082883-2bfc-4baf-be6e-531ea0b41986-registry-certificates\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.191420 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: 
\"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.191450 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-registry-tls\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.192004 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd082883-2bfc-4baf-be6e-531ea0b41986-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.192117 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-bound-sa-token\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.192149 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd082883-2bfc-4baf-be6e-531ea0b41986-trusted-ca\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.192187 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s95rq\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-kube-api-access-s95rq\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.232166 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.259802 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-kq4c6"] Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.260064 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" podUID="0d54f414-c740-438f-9337-9ba0d65db07b" containerName="controller-manager" containerID="cri-o://0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d" gracePeriod=30 Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293372 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd082883-2bfc-4baf-be6e-531ea0b41986-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293443 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd082883-2bfc-4baf-be6e-531ea0b41986-trusted-ca\") pod 
\"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293459 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-bound-sa-token\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293482 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s95rq\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-kube-api-access-s95rq\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293505 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd082883-2bfc-4baf-be6e-531ea0b41986-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293523 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd082883-2bfc-4baf-be6e-531ea0b41986-registry-certificates\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.293546 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-registry-tls\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.295261 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd082883-2bfc-4baf-be6e-531ea0b41986-registry-certificates\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.295564 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd082883-2bfc-4baf-be6e-531ea0b41986-trusted-ca\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.295577 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd082883-2bfc-4baf-be6e-531ea0b41986-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.300765 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd082883-2bfc-4baf-be6e-531ea0b41986-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.302961 4965 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-registry-tls\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.311045 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-bound-sa-token\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.311576 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95rq\" (UniqueName: \"kubernetes.io/projected/cd082883-2bfc-4baf-be6e-531ea0b41986-kube-api-access-s95rq\") pod \"image-registry-66df7c8f76-6t6zq\" (UID: \"cd082883-2bfc-4baf-be6e-531ea0b41986\") " pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.471526 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.763721 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.804755 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-client-ca\") pod \"0d54f414-c740-438f-9337-9ba0d65db07b\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.804833 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-proxy-ca-bundles\") pod \"0d54f414-c740-438f-9337-9ba0d65db07b\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.804863 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-config\") pod \"0d54f414-c740-438f-9337-9ba0d65db07b\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.804886 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54f414-c740-438f-9337-9ba0d65db07b-serving-cert\") pod \"0d54f414-c740-438f-9337-9ba0d65db07b\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.804962 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwm5g\" (UniqueName: \"kubernetes.io/projected/0d54f414-c740-438f-9337-9ba0d65db07b-kube-api-access-qwm5g\") pod \"0d54f414-c740-438f-9337-9ba0d65db07b\" (UID: \"0d54f414-c740-438f-9337-9ba0d65db07b\") " Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.805755 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d54f414-c740-438f-9337-9ba0d65db07b" (UID: "0d54f414-c740-438f-9337-9ba0d65db07b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.805812 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-config" (OuterVolumeSpecName: "config") pod "0d54f414-c740-438f-9337-9ba0d65db07b" (UID: "0d54f414-c740-438f-9337-9ba0d65db07b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.806375 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d54f414-c740-438f-9337-9ba0d65db07b" (UID: "0d54f414-c740-438f-9337-9ba0d65db07b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.808513 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d54f414-c740-438f-9337-9ba0d65db07b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d54f414-c740-438f-9337-9ba0d65db07b" (UID: "0d54f414-c740-438f-9337-9ba0d65db07b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.812070 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d54f414-c740-438f-9337-9ba0d65db07b-kube-api-access-qwm5g" (OuterVolumeSpecName: "kube-api-access-qwm5g") pod "0d54f414-c740-438f-9337-9ba0d65db07b" (UID: "0d54f414-c740-438f-9337-9ba0d65db07b"). InnerVolumeSpecName "kube-api-access-qwm5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.902579 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6t6zq"] Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.905916 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.905938 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.905948 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54f414-c740-438f-9337-9ba0d65db07b-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.905957 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54f414-c740-438f-9337-9ba0d65db07b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:52 crc kubenswrapper[4965]: I1125 15:09:52.905978 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwm5g\" (UniqueName: \"kubernetes.io/projected/0d54f414-c740-438f-9337-9ba0d65db07b-kube-api-access-qwm5g\") on node \"crc\" DevicePath \"\"" Nov 25 15:09:52 crc kubenswrapper[4965]: W1125 15:09:52.908039 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd082883_2bfc_4baf_be6e_531ea0b41986.slice/crio-ff555333582232832ad19a772ce8de593667179b3e0d24c5498340bc12ffd1c9 WatchSource:0}: Error finding container ff555333582232832ad19a772ce8de593667179b3e0d24c5498340bc12ffd1c9: Status 404 returned 
error can't find the container with id ff555333582232832ad19a772ce8de593667179b3e0d24c5498340bc12ffd1c9 Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.030356 4965 generic.go:334] "Generic (PLEG): container finished" podID="0d54f414-c740-438f-9337-9ba0d65db07b" containerID="0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d" exitCode=0 Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.030424 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" event={"ID":"0d54f414-c740-438f-9337-9ba0d65db07b","Type":"ContainerDied","Data":"0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d"} Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.030388 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.030497 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f784d6689-kq4c6" event={"ID":"0d54f414-c740-438f-9337-9ba0d65db07b","Type":"ContainerDied","Data":"a624759e32d306555de1d0c4487e3d77ee32ea457ce597714d8d158a269fe16c"} Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.030525 4965 scope.go:117] "RemoveContainer" containerID="0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.034925 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" event={"ID":"cd082883-2bfc-4baf-be6e-531ea0b41986","Type":"ContainerStarted","Data":"ff555333582232832ad19a772ce8de593667179b3e0d24c5498340bc12ffd1c9"} Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.051534 4965 scope.go:117] "RemoveContainer" containerID="0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d" Nov 25 15:09:53 crc kubenswrapper[4965]: E1125 
15:09:53.052004 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d\": container with ID starting with 0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d not found: ID does not exist" containerID="0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.052091 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d"} err="failed to get container status \"0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d\": rpc error: code = NotFound desc = could not find container \"0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d\": container with ID starting with 0966f4bcc728581d2fd05c063119e20b724401ecbe63e27237e3c1fa04bcec4d not found: ID does not exist" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.076748 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-kq4c6"] Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.082103 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f784d6689-kq4c6"] Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.260217 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.260427 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.370954 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-795d7c65cb-mcc5s"] Nov 25 15:09:53 crc kubenswrapper[4965]: E1125 15:09:53.371388 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d54f414-c740-438f-9337-9ba0d65db07b" containerName="controller-manager" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.371452 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d54f414-c740-438f-9337-9ba0d65db07b" containerName="controller-manager" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.371609 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d54f414-c740-438f-9337-9ba0d65db07b" containerName="controller-manager" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.372035 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.374407 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.374509 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.374455 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.374679 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.376074 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.381403 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.386077 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.400238 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795d7c65cb-mcc5s"] Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.412507 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-config\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " 
pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.412552 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-proxy-ca-bundles\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.412592 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsss\" (UniqueName: \"kubernetes.io/projected/4752d8b6-0256-4c09-8175-c72e976f33e3-kube-api-access-clsss\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.412619 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4752d8b6-0256-4c09-8175-c72e976f33e3-serving-cert\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.412643 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-client-ca\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.513518 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-clsss\" (UniqueName: \"kubernetes.io/projected/4752d8b6-0256-4c09-8175-c72e976f33e3-kube-api-access-clsss\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.514170 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4752d8b6-0256-4c09-8175-c72e976f33e3-serving-cert\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.514896 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-client-ca\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.515104 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-config\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.516478 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-proxy-ca-bundles\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 
15:09:53.516388 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-config\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.515844 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-client-ca\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.517585 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4752d8b6-0256-4c09-8175-c72e976f33e3-proxy-ca-bundles\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.519944 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4752d8b6-0256-4c09-8175-c72e976f33e3-serving-cert\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.540297 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsss\" (UniqueName: \"kubernetes.io/projected/4752d8b6-0256-4c09-8175-c72e976f33e3-kube-api-access-clsss\") pod \"controller-manager-795d7c65cb-mcc5s\" (UID: \"4752d8b6-0256-4c09-8175-c72e976f33e3\") " 
pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.692148 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:53 crc kubenswrapper[4965]: W1125 15:09:53.929671 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4752d8b6_0256_4c09_8175_c72e976f33e3.slice/crio-d05faa6c7724ffcfdbca89b9f60c276cc1324826c99510dabb731bce07ca79e0 WatchSource:0}: Error finding container d05faa6c7724ffcfdbca89b9f60c276cc1324826c99510dabb731bce07ca79e0: Status 404 returned error can't find the container with id d05faa6c7724ffcfdbca89b9f60c276cc1324826c99510dabb731bce07ca79e0 Nov 25 15:09:53 crc kubenswrapper[4965]: I1125 15:09:53.931024 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795d7c65cb-mcc5s"] Nov 25 15:09:54 crc kubenswrapper[4965]: I1125 15:09:54.043514 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" event={"ID":"cd082883-2bfc-4baf-be6e-531ea0b41986","Type":"ContainerStarted","Data":"63bc2ba9dc4b4c4bf841ad1eb67049079ea7b577e0ff022f6b0ed4658b00e97b"} Nov 25 15:09:54 crc kubenswrapper[4965]: I1125 15:09:54.043831 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" Nov 25 15:09:54 crc kubenswrapper[4965]: I1125 15:09:54.044811 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" event={"ID":"4752d8b6-0256-4c09-8175-c72e976f33e3","Type":"ContainerStarted","Data":"d05faa6c7724ffcfdbca89b9f60c276cc1324826c99510dabb731bce07ca79e0"} Nov 25 15:09:54 crc kubenswrapper[4965]: I1125 15:09:54.064513 4965 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq" podStartSLOduration=2.064486609 podStartE2EDuration="2.064486609s" podCreationTimestamp="2025-11-25 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:54.059415628 +0000 UTC m=+339.027009374" watchObservedRunningTime="2025-11-25 15:09:54.064486609 +0000 UTC m=+339.032080365" Nov 25 15:09:54 crc kubenswrapper[4965]: I1125 15:09:54.777076 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d54f414-c740-438f-9337-9ba0d65db07b" path="/var/lib/kubelet/pods/0d54f414-c740-438f-9337-9ba0d65db07b/volumes" Nov 25 15:09:55 crc kubenswrapper[4965]: I1125 15:09:55.052508 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" event={"ID":"4752d8b6-0256-4c09-8175-c72e976f33e3","Type":"ContainerStarted","Data":"e443f26e714b979e6ec61322a6c6557999a6a53cb175ea75290eecc71d148a88"} Nov 25 15:09:55 crc kubenswrapper[4965]: I1125 15:09:55.053015 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:55 crc kubenswrapper[4965]: I1125 15:09:55.060302 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" Nov 25 15:09:55 crc kubenswrapper[4965]: I1125 15:09:55.071574 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-795d7c65cb-mcc5s" podStartSLOduration=3.07154925 podStartE2EDuration="3.07154925s" podCreationTimestamp="2025-11-25 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:09:55.069070521 +0000 UTC m=+340.036664267" 
watchObservedRunningTime="2025-11-25 15:09:55.07154925 +0000 UTC m=+340.039142996" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.559021 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9jt9"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.567109 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6blkb"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.567756 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9jt9" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="registry-server" containerID="cri-o://cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456" gracePeriod=30 Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.568251 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6blkb" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="registry-server" containerID="cri-o://519627fa84fe75bca558edb39f2f6428755a1f9c4bd70537fc1675d218ba1059" gracePeriod=30 Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.575693 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nld56"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.577613 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerName="marketplace-operator" containerID="cri-o://a9480a59fae99887ce186b7a3beca72fb8cfc2292c464ff86c3c9e85447cac18" gracePeriod=30 Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.586068 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndvs4"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.586314 4965 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ndvs4" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="registry-server" containerID="cri-o://9fd377b309a2407202a91c7b56c3be8010cf1afe85c1deaa6821aa69f5199dab" gracePeriod=30 Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.603665 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-52qw9"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.604534 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.606461 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j8f94"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.606744 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j8f94" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="registry-server" containerID="cri-o://8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928" gracePeriod=30 Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.632362 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-52qw9"] Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.695633 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/86b81218-a04c-44e9-b4bc-efa18ee58d7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.695681 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shhln\" (UniqueName: \"kubernetes.io/projected/86b81218-a04c-44e9-b4bc-efa18ee58d7e-kube-api-access-shhln\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.695703 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86b81218-a04c-44e9-b4bc-efa18ee58d7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.797619 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/86b81218-a04c-44e9-b4bc-efa18ee58d7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.797710 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shhln\" (UniqueName: \"kubernetes.io/projected/86b81218-a04c-44e9-b4bc-efa18ee58d7e-kube-api-access-shhln\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.797761 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86b81218-a04c-44e9-b4bc-efa18ee58d7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-52qw9\" 
(UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.799505 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86b81218-a04c-44e9-b4bc-efa18ee58d7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.810928 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/86b81218-a04c-44e9-b4bc-efa18ee58d7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.814561 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shhln\" (UniqueName: \"kubernetes.io/projected/86b81218-a04c-44e9-b4bc-efa18ee58d7e-kube-api-access-shhln\") pod \"marketplace-operator-79b997595-52qw9\" (UID: \"86b81218-a04c-44e9-b4bc-efa18ee58d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:04 crc kubenswrapper[4965]: I1125 15:10:04.923595 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.083659 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.106422 4965 generic.go:334] "Generic (PLEG): container finished" podID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerID="cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456" exitCode=0 Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.106474 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9jt9" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.106490 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9jt9" event={"ID":"1269cb10-777f-46e4-a52f-8088e7af6b2d","Type":"ContainerDied","Data":"cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456"} Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.106521 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9jt9" event={"ID":"1269cb10-777f-46e4-a52f-8088e7af6b2d","Type":"ContainerDied","Data":"2402bfe97d6421b40be7d1bdf96b6779d2cb8ddeba0f9968a153576105522ddc"} Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.106538 4965 scope.go:117] "RemoveContainer" containerID="cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.109104 4965 generic.go:334] "Generic (PLEG): container finished" podID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerID="a9480a59fae99887ce186b7a3beca72fb8cfc2292c464ff86c3c9e85447cac18" exitCode=0 Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.109154 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" event={"ID":"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1","Type":"ContainerDied","Data":"a9480a59fae99887ce186b7a3beca72fb8cfc2292c464ff86c3c9e85447cac18"} Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 
15:10:05.111619 4965 generic.go:334] "Generic (PLEG): container finished" podID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerID="8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928" exitCode=0 Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.111676 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerDied","Data":"8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928"} Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.113558 4965 generic.go:334] "Generic (PLEG): container finished" podID="79a7c128-3c65-491c-95b6-52337183df64" containerID="519627fa84fe75bca558edb39f2f6428755a1f9c4bd70537fc1675d218ba1059" exitCode=0 Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.113602 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6blkb" event={"ID":"79a7c128-3c65-491c-95b6-52337183df64","Type":"ContainerDied","Data":"519627fa84fe75bca558edb39f2f6428755a1f9c4bd70537fc1675d218ba1059"} Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.115044 4965 generic.go:334] "Generic (PLEG): container finished" podID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerID="9fd377b309a2407202a91c7b56c3be8010cf1afe85c1deaa6821aa69f5199dab" exitCode=0 Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.115066 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerDied","Data":"9fd377b309a2407202a91c7b56c3be8010cf1afe85c1deaa6821aa69f5199dab"} Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.165292 4965 scope.go:117] "RemoveContainer" containerID="6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.192102 4965 scope.go:117] "RemoveContainer" 
containerID="c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.203668 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-catalog-content\") pod \"1269cb10-777f-46e4-a52f-8088e7af6b2d\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.203822 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-utilities\") pod \"1269cb10-777f-46e4-a52f-8088e7af6b2d\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.204042 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvmjs\" (UniqueName: \"kubernetes.io/projected/1269cb10-777f-46e4-a52f-8088e7af6b2d-kube-api-access-nvmjs\") pod \"1269cb10-777f-46e4-a52f-8088e7af6b2d\" (UID: \"1269cb10-777f-46e4-a52f-8088e7af6b2d\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.204989 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-utilities" (OuterVolumeSpecName: "utilities") pod "1269cb10-777f-46e4-a52f-8088e7af6b2d" (UID: "1269cb10-777f-46e4-a52f-8088e7af6b2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.211826 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1269cb10-777f-46e4-a52f-8088e7af6b2d-kube-api-access-nvmjs" (OuterVolumeSpecName: "kube-api-access-nvmjs") pod "1269cb10-777f-46e4-a52f-8088e7af6b2d" (UID: "1269cb10-777f-46e4-a52f-8088e7af6b2d"). InnerVolumeSpecName "kube-api-access-nvmjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.214081 4965 scope.go:117] "RemoveContainer" containerID="cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456" Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.217141 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456\": container with ID starting with cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456 not found: ID does not exist" containerID="cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.217187 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456"} err="failed to get container status \"cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456\": rpc error: code = NotFound desc = could not find container \"cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456\": container with ID starting with cb1ab79b2aed2d7d3cfedcdfd3203aa2ce0e313689aae4215363bde6ce1d0456 not found: ID does not exist" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.217238 4965 scope.go:117] "RemoveContainer" containerID="6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7" Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.218332 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7\": container with ID starting with 6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7 not found: ID does not exist" containerID="6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.218363 
4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7"} err="failed to get container status \"6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7\": rpc error: code = NotFound desc = could not find container \"6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7\": container with ID starting with 6ec357da03ded1663b769106efe093ef75a605960a926dc0bc15f72ccd3c28a7 not found: ID does not exist" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.218385 4965 scope.go:117] "RemoveContainer" containerID="c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d" Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.220266 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d\": container with ID starting with c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d not found: ID does not exist" containerID="c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.220286 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d"} err="failed to get container status \"c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d\": rpc error: code = NotFound desc = could not find container \"c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d\": container with ID starting with c8f601adc540161800a70b36367f418f3ce24931d7afbceebddc7700c8d9dc8d not found: ID does not exist" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.267785 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "1269cb10-777f-46e4-a52f-8088e7af6b2d" (UID: "1269cb10-777f-46e4-a52f-8088e7af6b2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.305695 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvmjs\" (UniqueName: \"kubernetes.io/projected/1269cb10-777f-46e4-a52f-8088e7af6b2d-kube-api-access-nvmjs\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.305727 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.305736 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1269cb10-777f-46e4-a52f-8088e7af6b2d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.367204 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.372265 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.460007 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928 is running failed: container process not found" containerID="8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.460225 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928 is running failed: container process not found" containerID="8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.460379 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928 is running failed: container process not found" containerID="8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:10:05 crc kubenswrapper[4965]: E1125 15:10:05.460400 4965 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-j8f94" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="registry-server" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.485709 4965 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9jt9"] Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.485756 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9jt9"] Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.508327 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.518659 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-trusted-ca\") pod \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.518910 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/79a7c128-3c65-491c-95b6-52337183df64-kube-api-access-xpjkd\") pod \"79a7c128-3c65-491c-95b6-52337183df64\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.518938 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-operator-metrics\") pod \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.518957 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr852\" (UniqueName: \"kubernetes.io/projected/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-kube-api-access-dr852\") pod \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\" (UID: \"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.518992 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-utilities\") pod \"79a7c128-3c65-491c-95b6-52337183df64\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.519087 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-catalog-content\") pod \"79a7c128-3c65-491c-95b6-52337183df64\" (UID: \"79a7c128-3c65-491c-95b6-52337183df64\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.520149 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" (UID: "2f1d4843-3e5c-4c43-89d8-73271b2f3cf1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.521605 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-utilities" (OuterVolumeSpecName: "utilities") pod "79a7c128-3c65-491c-95b6-52337183df64" (UID: "79a7c128-3c65-491c-95b6-52337183df64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.524493 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-kube-api-access-dr852" (OuterVolumeSpecName: "kube-api-access-dr852") pod "2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" (UID: "2f1d4843-3e5c-4c43-89d8-73271b2f3cf1"). InnerVolumeSpecName "kube-api-access-dr852". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.531644 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a7c128-3c65-491c-95b6-52337183df64-kube-api-access-xpjkd" (OuterVolumeSpecName: "kube-api-access-xpjkd") pod "79a7c128-3c65-491c-95b6-52337183df64" (UID: "79a7c128-3c65-491c-95b6-52337183df64"). InnerVolumeSpecName "kube-api-access-xpjkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.541839 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" (UID: "2f1d4843-3e5c-4c43-89d8-73271b2f3cf1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.548466 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.608266 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-52qw9"] Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.620911 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwgp\" (UniqueName: \"kubernetes.io/projected/2fccc0df-85ec-4aeb-9217-00c37ea16e67-kube-api-access-2wwgp\") pod \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621262 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-utilities\") pod \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621315 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-catalog-content\") pod \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\" (UID: \"2fccc0df-85ec-4aeb-9217-00c37ea16e67\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621553 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621571 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpjkd\" (UniqueName: \"kubernetes.io/projected/79a7c128-3c65-491c-95b6-52337183df64-kube-api-access-xpjkd\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621583 4965 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621593 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr852\" (UniqueName: \"kubernetes.io/projected/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1-kube-api-access-dr852\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.621603 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.620956 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79a7c128-3c65-491c-95b6-52337183df64" (UID: "79a7c128-3c65-491c-95b6-52337183df64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.623873 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fccc0df-85ec-4aeb-9217-00c37ea16e67-kube-api-access-2wwgp" (OuterVolumeSpecName: "kube-api-access-2wwgp") pod "2fccc0df-85ec-4aeb-9217-00c37ea16e67" (UID: "2fccc0df-85ec-4aeb-9217-00c37ea16e67"). InnerVolumeSpecName "kube-api-access-2wwgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.625400 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-utilities" (OuterVolumeSpecName: "utilities") pod "2fccc0df-85ec-4aeb-9217-00c37ea16e67" (UID: "2fccc0df-85ec-4aeb-9217-00c37ea16e67"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.722741 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-utilities\") pod \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.722893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs9tv\" (UniqueName: \"kubernetes.io/projected/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-kube-api-access-xs9tv\") pod \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.722925 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-catalog-content\") pod \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\" (UID: \"cd34d943-ba97-48d9-bb26-cfa6d0ec549b\") " Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.723352 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a7c128-3c65-491c-95b6-52337183df64-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.723381 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwgp\" (UniqueName: \"kubernetes.io/projected/2fccc0df-85ec-4aeb-9217-00c37ea16e67-kube-api-access-2wwgp\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.723396 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc 
kubenswrapper[4965]: I1125 15:10:05.723614 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-utilities" (OuterVolumeSpecName: "utilities") pod "cd34d943-ba97-48d9-bb26-cfa6d0ec549b" (UID: "cd34d943-ba97-48d9-bb26-cfa6d0ec549b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.727556 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-kube-api-access-xs9tv" (OuterVolumeSpecName: "kube-api-access-xs9tv") pod "cd34d943-ba97-48d9-bb26-cfa6d0ec549b" (UID: "cd34d943-ba97-48d9-bb26-cfa6d0ec549b"). InnerVolumeSpecName "kube-api-access-xs9tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.747842 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd34d943-ba97-48d9-bb26-cfa6d0ec549b" (UID: "cd34d943-ba97-48d9-bb26-cfa6d0ec549b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.748762 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fccc0df-85ec-4aeb-9217-00c37ea16e67" (UID: "2fccc0df-85ec-4aeb-9217-00c37ea16e67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.825462 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs9tv\" (UniqueName: \"kubernetes.io/projected/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-kube-api-access-xs9tv\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.825537 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.825554 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34d943-ba97-48d9-bb26-cfa6d0ec549b-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:05 crc kubenswrapper[4965]: I1125 15:10:05.825596 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fccc0df-85ec-4aeb-9217-00c37ea16e67-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.121328 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8f94" event={"ID":"2fccc0df-85ec-4aeb-9217-00c37ea16e67","Type":"ContainerDied","Data":"49e3567a71ee49846a9bd218bf24649ac4e972398eba162b21a11437df0488e3"} Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.121385 4965 scope.go:117] "RemoveContainer" containerID="8215130bdedc87c18a3489a684d2aa268718a04f5caf7deae4f6e060cd9c2928" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.121573 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j8f94" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.123575 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6blkb" event={"ID":"79a7c128-3c65-491c-95b6-52337183df64","Type":"ContainerDied","Data":"4f68fbdac800b963148815bbba9df5458f6c9ff77a7c027793e18d520d0c86bb"} Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.123766 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6blkb" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.125451 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndvs4" event={"ID":"cd34d943-ba97-48d9-bb26-cfa6d0ec549b","Type":"ContainerDied","Data":"901f47277456c092ddd972267f7d23450433ac248fa4af25079317ee048ae0c8"} Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.125566 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndvs4" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.128014 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" event={"ID":"86b81218-a04c-44e9-b4bc-efa18ee58d7e","Type":"ContainerStarted","Data":"d66afe643871bc256cf02751735334709d9ca385dd9c0ad14cc39315dab77f87"} Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.128058 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" event={"ID":"86b81218-a04c-44e9-b4bc-efa18ee58d7e","Type":"ContainerStarted","Data":"cf0896c825b691155e156bb5df116d4e1e8ec4705c3dc114ccd7ec0ecf3f9c8e"} Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.128335 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.131850 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.133303 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" event={"ID":"2f1d4843-3e5c-4c43-89d8-73271b2f3cf1","Type":"ContainerDied","Data":"224577b33633e16adf2d5e2bf418384650794cbe4d5d0ba3d38ecab52a81bf32"} Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.133343 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nld56" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.169272 4965 scope.go:117] "RemoveContainer" containerID="fc5208ff388cc5be0685cc6826bf2fc28fa93337d185fc4750b6819994cb1687" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.169257 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-52qw9" podStartSLOduration=2.169220464 podStartE2EDuration="2.169220464s" podCreationTimestamp="2025-11-25 15:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:10:06.161252344 +0000 UTC m=+351.128846090" watchObservedRunningTime="2025-11-25 15:10:06.169220464 +0000 UTC m=+351.136814210" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.195804 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndvs4"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.200541 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndvs4"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.205513 4965 scope.go:117] "RemoveContainer" containerID="5ce18658d64d39d4f5d01dddf5a1acf245e7e956b136e9d044187290a5d4499f" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.214286 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j8f94"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.216804 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j8f94"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.224248 4965 scope.go:117] "RemoveContainer" containerID="519627fa84fe75bca558edb39f2f6428755a1f9c4bd70537fc1675d218ba1059" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.241643 4965 scope.go:117] "RemoveContainer" 
containerID="9ede19fafee6f4cdb2ebc2de199584e8d5f9b0261394e33451d045f40b3c6e67" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.247207 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nld56"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.257864 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nld56"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.264988 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6blkb"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.268545 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6blkb"] Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.271946 4965 scope.go:117] "RemoveContainer" containerID="59de2e8747e9de7029ff7b949030bb9ce7deeeb86e69e48dad55ca4b62e68484" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.290234 4965 scope.go:117] "RemoveContainer" containerID="9fd377b309a2407202a91c7b56c3be8010cf1afe85c1deaa6821aa69f5199dab" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.306253 4965 scope.go:117] "RemoveContainer" containerID="6bf2466eea8ca0d1c05f8530a35593f71a2ee6143b05dd6414baa1ccdd97e99e" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.320431 4965 scope.go:117] "RemoveContainer" containerID="6f4ae749b08f1505ccb07a1541823399da8b6b6309573d7f4d3c0e562108c09e" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.338696 4965 scope.go:117] "RemoveContainer" containerID="a9480a59fae99887ce186b7a3beca72fb8cfc2292c464ff86c3c9e85447cac18" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.777612 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" path="/var/lib/kubelet/pods/1269cb10-777f-46e4-a52f-8088e7af6b2d/volumes" Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 
15:10:06.778560 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" path="/var/lib/kubelet/pods/2f1d4843-3e5c-4c43-89d8-73271b2f3cf1/volumes"
Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.779041 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" path="/var/lib/kubelet/pods/2fccc0df-85ec-4aeb-9217-00c37ea16e67/volumes"
Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.780106 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a7c128-3c65-491c-95b6-52337183df64" path="/var/lib/kubelet/pods/79a7c128-3c65-491c-95b6-52337183df64/volumes"
Nov 25 15:10:06 crc kubenswrapper[4965]: I1125 15:10:06.780743 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" path="/var/lib/kubelet/pods/cd34d943-ba97-48d9-bb26-cfa6d0ec549b/volumes"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.178550 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8587n"]
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.179170 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.179335 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.179500 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.179581 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.179638 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.179690 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.179747 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.179807 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.179860 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerName="marketplace-operator"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.179920 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerName="marketplace-operator"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.179996 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180055 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="extract-utilities"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180107 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180172 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180245 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180310 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180364 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180421 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180479 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180559 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180648 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180733 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180821 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.180905 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="extract-content"
Nov 25 15:10:07 crc kubenswrapper[4965]: E1125 15:10:07.180999 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.181078 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.181273 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fccc0df-85ec-4aeb-9217-00c37ea16e67" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.181380 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="1269cb10-777f-46e4-a52f-8088e7af6b2d" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.181466 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a7c128-3c65-491c-95b6-52337183df64" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.181550 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd34d943-ba97-48d9-bb26-cfa6d0ec549b" containerName="registry-server"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.181633 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1d4843-3e5c-4c43-89d8-73271b2f3cf1" containerName="marketplace-operator"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.182548 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8587n"]
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.182941 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.184975 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.348998 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0430c6-ead3-4363-aea8-068563e1bdfe-utilities\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.349095 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0430c6-ead3-4363-aea8-068563e1bdfe-catalog-content\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.349168 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bxk\" (UniqueName: \"kubernetes.io/projected/fa0430c6-ead3-4363-aea8-068563e1bdfe-kube-api-access-s9bxk\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.450571 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bxk\" (UniqueName: \"kubernetes.io/projected/fa0430c6-ead3-4363-aea8-068563e1bdfe-kube-api-access-s9bxk\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.450863 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0430c6-ead3-4363-aea8-068563e1bdfe-utilities\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.450980 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0430c6-ead3-4363-aea8-068563e1bdfe-catalog-content\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.451462 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0430c6-ead3-4363-aea8-068563e1bdfe-utilities\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.451505 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0430c6-ead3-4363-aea8-068563e1bdfe-catalog-content\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.468272 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bxk\" (UniqueName: \"kubernetes.io/projected/fa0430c6-ead3-4363-aea8-068563e1bdfe-kube-api-access-s9bxk\") pod \"certified-operators-8587n\" (UID: \"fa0430c6-ead3-4363-aea8-068563e1bdfe\") " pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.507649 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:07 crc kubenswrapper[4965]: I1125 15:10:07.928816 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8587n"]
Nov 25 15:10:07 crc kubenswrapper[4965]: W1125 15:10:07.936372 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa0430c6_ead3_4363_aea8_068563e1bdfe.slice/crio-4d5408f7e72c90360163ab349e89cd6c9da00fcb9c1f9fdeb8186d9289f1bd3f WatchSource:0}: Error finding container 4d5408f7e72c90360163ab349e89cd6c9da00fcb9c1f9fdeb8186d9289f1bd3f: Status 404 returned error can't find the container with id 4d5408f7e72c90360163ab349e89cd6c9da00fcb9c1f9fdeb8186d9289f1bd3f
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.148263 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8587n" event={"ID":"fa0430c6-ead3-4363-aea8-068563e1bdfe","Type":"ContainerStarted","Data":"4d5408f7e72c90360163ab349e89cd6c9da00fcb9c1f9fdeb8186d9289f1bd3f"}
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.575011 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2c9sk"]
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.576244 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.584241 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.587733 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2c9sk"]
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.666281 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e97c3d-45dc-4c17-86f0-06141c4b5b69-utilities\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.666372 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e97c3d-45dc-4c17-86f0-06141c4b5b69-catalog-content\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.666457 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzmtn\" (UniqueName: \"kubernetes.io/projected/98e97c3d-45dc-4c17-86f0-06141c4b5b69-kube-api-access-jzmtn\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.767411 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzmtn\" (UniqueName: \"kubernetes.io/projected/98e97c3d-45dc-4c17-86f0-06141c4b5b69-kube-api-access-jzmtn\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.767460 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e97c3d-45dc-4c17-86f0-06141c4b5b69-utilities\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.767514 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e97c3d-45dc-4c17-86f0-06141c4b5b69-catalog-content\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.768103 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e97c3d-45dc-4c17-86f0-06141c4b5b69-catalog-content\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.768434 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e97c3d-45dc-4c17-86f0-06141c4b5b69-utilities\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.785393 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzmtn\" (UniqueName: \"kubernetes.io/projected/98e97c3d-45dc-4c17-86f0-06141c4b5b69-kube-api-access-jzmtn\") pod \"redhat-operators-2c9sk\" (UID: \"98e97c3d-45dc-4c17-86f0-06141c4b5b69\") " pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:08 crc kubenswrapper[4965]: I1125 15:10:08.899050 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.154622 4965 generic.go:334] "Generic (PLEG): container finished" podID="fa0430c6-ead3-4363-aea8-068563e1bdfe" containerID="531b5da577c9ecf353770594c601365abf8b6a2f2c29b45ef9053590863df86d" exitCode=0
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.154678 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8587n" event={"ID":"fa0430c6-ead3-4363-aea8-068563e1bdfe","Type":"ContainerDied","Data":"531b5da577c9ecf353770594c601365abf8b6a2f2c29b45ef9053590863df86d"}
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.286346 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2c9sk"]
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.577654 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6n7bt"]
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.578657 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.586745 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.591111 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6n7bt"]
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.682375 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-utilities\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.682454 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-catalog-content\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.682498 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlvtq\" (UniqueName: \"kubernetes.io/projected/42408875-e2d8-4537-85c8-aa2f8fe58cc0-kube-api-access-dlvtq\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.783655 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-utilities\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.783713 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-catalog-content\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.783757 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlvtq\" (UniqueName: \"kubernetes.io/projected/42408875-e2d8-4537-85c8-aa2f8fe58cc0-kube-api-access-dlvtq\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.785813 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-utilities\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.788758 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-catalog-content\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.804861 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlvtq\" (UniqueName: \"kubernetes.io/projected/42408875-e2d8-4537-85c8-aa2f8fe58cc0-kube-api-access-dlvtq\") pod \"community-operators-6n7bt\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:09 crc kubenswrapper[4965]: I1125 15:10:09.900027 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n7bt"
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.162032 4965 generic.go:334] "Generic (PLEG): container finished" podID="98e97c3d-45dc-4c17-86f0-06141c4b5b69" containerID="069b73ff0fe5d2d7cf5aaedaae43a8c1dbd695ebffe634a58b2f85217e7dd862" exitCode=0
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.162119 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c9sk" event={"ID":"98e97c3d-45dc-4c17-86f0-06141c4b5b69","Type":"ContainerDied","Data":"069b73ff0fe5d2d7cf5aaedaae43a8c1dbd695ebffe634a58b2f85217e7dd862"}
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.162361 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c9sk" event={"ID":"98e97c3d-45dc-4c17-86f0-06141c4b5b69","Type":"ContainerStarted","Data":"f8c252bbed64f79fde9bcded878f0fed989d37b5b70cb37852e47fd9c7086cbb"}
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.288078 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6n7bt"]
Nov 25 15:10:10 crc kubenswrapper[4965]: W1125 15:10:10.291900 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42408875_e2d8_4537_85c8_aa2f8fe58cc0.slice/crio-930ced5038582aded1bb4660fc30f4c2a1692af13300cd441fe36a627838bc95 WatchSource:0}: Error finding container 930ced5038582aded1bb4660fc30f4c2a1692af13300cd441fe36a627838bc95: Status 404 returned error can't find the container with id 930ced5038582aded1bb4660fc30f4c2a1692af13300cd441fe36a627838bc95
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.975593 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6plmq"]
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.979314 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.981557 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 25 15:10:10 crc kubenswrapper[4965]: I1125 15:10:10.987915 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6plmq"]
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.103311 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6plfc\" (UniqueName: \"kubernetes.io/projected/b439de70-154c-4dbf-99b2-4cc5e9f03996-kube-api-access-6plfc\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.103384 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b439de70-154c-4dbf-99b2-4cc5e9f03996-catalog-content\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.103427 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b439de70-154c-4dbf-99b2-4cc5e9f03996-utilities\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.169886 4965 generic.go:334] "Generic (PLEG): container finished" podID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerID="17e7f2030e6d2bbe23edc5e3740712906e6749b8e21641d1727e480e5d94b5fd" exitCode=0
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.170002 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7bt" event={"ID":"42408875-e2d8-4537-85c8-aa2f8fe58cc0","Type":"ContainerDied","Data":"17e7f2030e6d2bbe23edc5e3740712906e6749b8e21641d1727e480e5d94b5fd"}
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.170031 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7bt" event={"ID":"42408875-e2d8-4537-85c8-aa2f8fe58cc0","Type":"ContainerStarted","Data":"930ced5038582aded1bb4660fc30f4c2a1692af13300cd441fe36a627838bc95"}
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.172875 4965 generic.go:334] "Generic (PLEG): container finished" podID="fa0430c6-ead3-4363-aea8-068563e1bdfe" containerID="d5155cb0dc20b17ff8fbcc190b9138163f072d6e09f9f05e11c784b47409f315" exitCode=0
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.172903 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8587n" event={"ID":"fa0430c6-ead3-4363-aea8-068563e1bdfe","Type":"ContainerDied","Data":"d5155cb0dc20b17ff8fbcc190b9138163f072d6e09f9f05e11c784b47409f315"}
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.206318 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b439de70-154c-4dbf-99b2-4cc5e9f03996-catalog-content\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.206398 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b439de70-154c-4dbf-99b2-4cc5e9f03996-utilities\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.206465 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6plfc\" (UniqueName: \"kubernetes.io/projected/b439de70-154c-4dbf-99b2-4cc5e9f03996-kube-api-access-6plfc\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.207158 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b439de70-154c-4dbf-99b2-4cc5e9f03996-catalog-content\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.207217 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b439de70-154c-4dbf-99b2-4cc5e9f03996-utilities\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.229080 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6plfc\" (UniqueName: \"kubernetes.io/projected/b439de70-154c-4dbf-99b2-4cc5e9f03996-kube-api-access-6plfc\") pod \"redhat-marketplace-6plmq\" (UID: \"b439de70-154c-4dbf-99b2-4cc5e9f03996\") " pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.299497 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6plmq"
Nov 25 15:10:11 crc kubenswrapper[4965]: I1125 15:10:11.860003 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6plmq"]
Nov 25 15:10:11 crc kubenswrapper[4965]: W1125 15:10:11.860683 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb439de70_154c_4dbf_99b2_4cc5e9f03996.slice/crio-d2bf9e46ba96a4dc51e339fd67632f53be8cdb2a86283a5bb9b7cc4271f79890 WatchSource:0}: Error finding container d2bf9e46ba96a4dc51e339fd67632f53be8cdb2a86283a5bb9b7cc4271f79890: Status 404 returned error can't find the container with id d2bf9e46ba96a4dc51e339fd67632f53be8cdb2a86283a5bb9b7cc4271f79890
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.180267 4965 generic.go:334] "Generic (PLEG): container finished" podID="b439de70-154c-4dbf-99b2-4cc5e9f03996" containerID="b40f94bfc2984da84579bb5d085d1b5008df33cb3d5d6f12adeae55c815d46db" exitCode=0
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.180428 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6plmq" event={"ID":"b439de70-154c-4dbf-99b2-4cc5e9f03996","Type":"ContainerDied","Data":"b40f94bfc2984da84579bb5d085d1b5008df33cb3d5d6f12adeae55c815d46db"}
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.180614 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6plmq" event={"ID":"b439de70-154c-4dbf-99b2-4cc5e9f03996","Type":"ContainerStarted","Data":"d2bf9e46ba96a4dc51e339fd67632f53be8cdb2a86283a5bb9b7cc4271f79890"}
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.183904 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8587n" event={"ID":"fa0430c6-ead3-4363-aea8-068563e1bdfe","Type":"ContainerStarted","Data":"6b3f0b893decdd287637fd347588ffb34ca0dc29c68b251ec50a0bf2b7f89488"}
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.187832 4965 generic.go:334] "Generic (PLEG): container finished" podID="98e97c3d-45dc-4c17-86f0-06141c4b5b69" containerID="203368c5866f96c54704fa426d8088ded0f65faa978def17cdaade84abb3b59f" exitCode=0
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.187859 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c9sk" event={"ID":"98e97c3d-45dc-4c17-86f0-06141c4b5b69","Type":"ContainerDied","Data":"203368c5866f96c54704fa426d8088ded0f65faa978def17cdaade84abb3b59f"}
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.235447 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8587n" podStartSLOduration=2.498898821 podStartE2EDuration="5.235429164s" podCreationTimestamp="2025-11-25 15:10:07 +0000 UTC" firstStartedPulling="2025-11-25 15:10:09.15641705 +0000 UTC m=+354.124010796" lastFinishedPulling="2025-11-25 15:10:11.892947393 +0000 UTC m=+356.860541139" observedRunningTime="2025-11-25 15:10:12.21462529 +0000 UTC m=+357.182219036" watchObservedRunningTime="2025-11-25 15:10:12.235429164 +0000 UTC m=+357.203022910"
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.477542 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6t6zq"
Nov 25 15:10:12 crc kubenswrapper[4965]: I1125 15:10:12.532716 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22g9m"]
Nov 25 15:10:13 crc kubenswrapper[4965]: I1125 15:10:13.193346 4965 generic.go:334] "Generic (PLEG): container finished" podID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerID="c29131e8d88f0cb759c0f5602dcf3106a9c78a003e8bd3f1967c291992e314bf" exitCode=0
Nov 25 15:10:13 crc kubenswrapper[4965]: I1125 15:10:13.193441 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7bt" event={"ID":"42408875-e2d8-4537-85c8-aa2f8fe58cc0","Type":"ContainerDied","Data":"c29131e8d88f0cb759c0f5602dcf3106a9c78a003e8bd3f1967c291992e314bf"}
Nov 25 15:10:16 crc kubenswrapper[4965]: I1125 15:10:16.212459 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c9sk" event={"ID":"98e97c3d-45dc-4c17-86f0-06141c4b5b69","Type":"ContainerStarted","Data":"18ce43ac819a7d566cb629a8c2141eb0d0f7d171f92c84227495d6e56769d279"}
Nov 25 15:10:16 crc kubenswrapper[4965]: I1125 15:10:16.230671 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2c9sk" podStartSLOduration=4.906055671 podStartE2EDuration="8.230652869s" podCreationTimestamp="2025-11-25 15:10:08 +0000 UTC" firstStartedPulling="2025-11-25 15:10:10.328658339 +0000 UTC m=+355.296252085" lastFinishedPulling="2025-11-25 15:10:13.653255537 +0000 UTC m=+358.620849283" observedRunningTime="2025-11-25 15:10:16.227046629 +0000 UTC m=+361.194640385" watchObservedRunningTime="2025-11-25 15:10:16.230652869 +0000 UTC m=+361.198246615"
Nov 25 15:10:17 crc kubenswrapper[4965]: I1125 15:10:17.507906 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:17 crc kubenswrapper[4965]: I1125 15:10:17.509036 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:17 crc kubenswrapper[4965]: I1125 15:10:17.571438 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:18 crc kubenswrapper[4965]: I1125 15:10:18.259263 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8587n"
Nov 25 15:10:18 crc kubenswrapper[4965]: I1125 15:10:18.900343 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:18 crc kubenswrapper[4965]: I1125 15:10:18.901498 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:19 crc kubenswrapper[4965]: I1125 15:10:19.955839 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2c9sk" podUID="98e97c3d-45dc-4c17-86f0-06141c4b5b69" containerName="registry-server" probeResult="failure" output=<
Nov 25 15:10:19 crc kubenswrapper[4965]: timeout: failed to connect service ":50051" within 1s
Nov 25 15:10:19 crc kubenswrapper[4965]: >
Nov 25 15:10:23 crc kubenswrapper[4965]: I1125 15:10:23.260311 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:10:23 crc kubenswrapper[4965]: I1125 15:10:23.260645 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:10:28 crc kubenswrapper[4965]: I1125 15:10:28.945532 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:28 crc kubenswrapper[4965]: I1125 15:10:28.998644 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2c9sk"
Nov 25 15:10:29 crc kubenswrapper[4965]: I1125 15:10:29.273744 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6plmq" event={"ID":"b439de70-154c-4dbf-99b2-4cc5e9f03996","Type":"ContainerStarted","Data":"c299b8b8104e950f747581dafc57e0163135ae6beaf6aa249a755e6b16b0c628"}
Nov 25 15:10:30 crc kubenswrapper[4965]: I1125 15:10:30.279854 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7bt" event={"ID":"42408875-e2d8-4537-85c8-aa2f8fe58cc0","Type":"ContainerStarted","Data":"cc76d35d55b940b2deb1af0e8a8c6a9d2a2eead59cef8e294e43e5290bee8048"}
Nov 25 15:10:30 crc kubenswrapper[4965]: I1125 15:10:30.281179 4965 generic.go:334] "Generic (PLEG): container finished" podID="b439de70-154c-4dbf-99b2-4cc5e9f03996" containerID="c299b8b8104e950f747581dafc57e0163135ae6beaf6aa249a755e6b16b0c628" exitCode=0
Nov 25 15:10:30 crc kubenswrapper[4965]: I1125 15:10:30.281203 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6plmq" event={"ID":"b439de70-154c-4dbf-99b2-4cc5e9f03996","Type":"ContainerDied","Data":"c299b8b8104e950f747581dafc57e0163135ae6beaf6aa249a755e6b16b0c628"}
Nov 25 15:10:30 crc kubenswrapper[4965]: I1125 15:10:30.305360 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6n7bt" podStartSLOduration=4.010630443 podStartE2EDuration="21.305337116s" podCreationTimestamp="2025-11-25 15:10:09 +0000 UTC" firstStartedPulling="2025-11-25 15:10:11.171126329 +0000 UTC m=+356.138720075" lastFinishedPulling="2025-11-25 15:10:28.465833002 +0000 UTC m=+373.433426748" observedRunningTime="2025-11-25 15:10:30.304298968 +0000 UTC m=+375.271892714" watchObservedRunningTime="2025-11-25 15:10:30.305337116 +0000 UTC m=+375.272930862"
Nov 25 15:10:32 crc kubenswrapper[4965]: I1125 15:10:32.197171 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"]
Nov 25 15:10:32 crc kubenswrapper[4965]: I1125 15:10:32.197648 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" podUID="5bafdade-fd9c-4ada-9263-a30532d21c32" containerName="route-controller-manager" containerID="cri-o://74f5f8c2fd49cb6b7b4180585a02ab93122e1dc20b02bb9811164153bacfe218" gracePeriod=30
Nov 25 15:10:32 crc kubenswrapper[4965]: I1125 15:10:32.291623 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6plmq" event={"ID":"b439de70-154c-4dbf-99b2-4cc5e9f03996","Type":"ContainerStarted","Data":"deebd82b671ce332f126b7cef6bb482717bb8f53d37c9e6d0d2584953a28544a"}
Nov 25 15:10:32 crc kubenswrapper[4965]: I1125 15:10:32.312538 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6plmq" podStartSLOduration=2.642842472 podStartE2EDuration="22.312518878s" podCreationTimestamp="2025-11-25 15:10:10 +0000 UTC" firstStartedPulling="2025-11-25 15:10:12.265755053 +0000 UTC m=+357.233348809" lastFinishedPulling="2025-11-25 15:10:31.935431459 +0000 UTC m=+376.903025215" observedRunningTime="2025-11-25 15:10:32.31039912 +0000 UTC m=+377.277992886" watchObservedRunningTime="2025-11-25 15:10:32.312518878 +0000 UTC m=+377.280112624"
Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.297913 4965 generic.go:334] "Generic (PLEG): container finished" podID="5bafdade-fd9c-4ada-9263-a30532d21c32" containerID="74f5f8c2fd49cb6b7b4180585a02ab93122e1dc20b02bb9811164153bacfe218" exitCode=0
Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.298022 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"
event={"ID":"5bafdade-fd9c-4ada-9263-a30532d21c32","Type":"ContainerDied","Data":"74f5f8c2fd49cb6b7b4180585a02ab93122e1dc20b02bb9811164153bacfe218"} Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.671239 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.705364 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg"] Nov 25 15:10:33 crc kubenswrapper[4965]: E1125 15:10:33.705555 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bafdade-fd9c-4ada-9263-a30532d21c32" containerName="route-controller-manager" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.705566 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bafdade-fd9c-4ada-9263-a30532d21c32" containerName="route-controller-manager" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.705670 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bafdade-fd9c-4ada-9263-a30532d21c32" containerName="route-controller-manager" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.706057 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.718975 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg"] Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.857874 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lr6\" (UniqueName: \"kubernetes.io/projected/5bafdade-fd9c-4ada-9263-a30532d21c32-kube-api-access-m2lr6\") pod \"5bafdade-fd9c-4ada-9263-a30532d21c32\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.857931 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-client-ca\") pod \"5bafdade-fd9c-4ada-9263-a30532d21c32\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858015 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bafdade-fd9c-4ada-9263-a30532d21c32-serving-cert\") pod \"5bafdade-fd9c-4ada-9263-a30532d21c32\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858088 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-config\") pod \"5bafdade-fd9c-4ada-9263-a30532d21c32\" (UID: \"5bafdade-fd9c-4ada-9263-a30532d21c32\") " Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858307 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407934c3-05c4-4108-81cd-1454eb614a7f-serving-cert\") pod 
\"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858331 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407934c3-05c4-4108-81cd-1454eb614a7f-config\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858386 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m62ns\" (UniqueName: \"kubernetes.io/projected/407934c3-05c4-4108-81cd-1454eb614a7f-kube-api-access-m62ns\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858405 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/407934c3-05c4-4108-81cd-1454eb614a7f-client-ca\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858849 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-config" (OuterVolumeSpecName: "config") pod "5bafdade-fd9c-4ada-9263-a30532d21c32" (UID: "5bafdade-fd9c-4ada-9263-a30532d21c32"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.858959 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-client-ca" (OuterVolumeSpecName: "client-ca") pod "5bafdade-fd9c-4ada-9263-a30532d21c32" (UID: "5bafdade-fd9c-4ada-9263-a30532d21c32"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.863710 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bafdade-fd9c-4ada-9263-a30532d21c32-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5bafdade-fd9c-4ada-9263-a30532d21c32" (UID: "5bafdade-fd9c-4ada-9263-a30532d21c32"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.863732 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bafdade-fd9c-4ada-9263-a30532d21c32-kube-api-access-m2lr6" (OuterVolumeSpecName: "kube-api-access-m2lr6") pod "5bafdade-fd9c-4ada-9263-a30532d21c32" (UID: "5bafdade-fd9c-4ada-9263-a30532d21c32"). InnerVolumeSpecName "kube-api-access-m2lr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959475 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407934c3-05c4-4108-81cd-1454eb614a7f-serving-cert\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959518 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407934c3-05c4-4108-81cd-1454eb614a7f-config\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959550 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m62ns\" (UniqueName: \"kubernetes.io/projected/407934c3-05c4-4108-81cd-1454eb614a7f-kube-api-access-m62ns\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959570 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/407934c3-05c4-4108-81cd-1454eb614a7f-client-ca\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959644 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959655 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lr6\" (UniqueName: \"kubernetes.io/projected/5bafdade-fd9c-4ada-9263-a30532d21c32-kube-api-access-m2lr6\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959666 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bafdade-fd9c-4ada-9263-a30532d21c32-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.959674 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bafdade-fd9c-4ada-9263-a30532d21c32-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.960591 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/407934c3-05c4-4108-81cd-1454eb614a7f-client-ca\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.960796 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407934c3-05c4-4108-81cd-1454eb614a7f-config\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.962856 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407934c3-05c4-4108-81cd-1454eb614a7f-serving-cert\") pod 
\"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:33 crc kubenswrapper[4965]: I1125 15:10:33.975365 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m62ns\" (UniqueName: \"kubernetes.io/projected/407934c3-05c4-4108-81cd-1454eb614a7f-kube-api-access-m62ns\") pod \"route-controller-manager-99cc7585b-tjrvg\" (UID: \"407934c3-05c4-4108-81cd-1454eb614a7f\") " pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.025264 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.304037 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" event={"ID":"5bafdade-fd9c-4ada-9263-a30532d21c32","Type":"ContainerDied","Data":"9c0d15952094489101445da2cdf88700efb147b9c4d0e929cc0e705c4ffeac83"} Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.304323 4965 scope.go:117] "RemoveContainer" containerID="74f5f8c2fd49cb6b7b4180585a02ab93122e1dc20b02bb9811164153bacfe218" Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.304086 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z" Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.330177 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"] Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.339628 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657648d9fc-9lg2z"] Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.451603 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg"] Nov 25 15:10:34 crc kubenswrapper[4965]: W1125 15:10:34.457046 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod407934c3_05c4_4108_81cd_1454eb614a7f.slice/crio-9496860c411282ddd37ae2863c2129ecc836460ab263428b362f8e726cae9618 WatchSource:0}: Error finding container 9496860c411282ddd37ae2863c2129ecc836460ab263428b362f8e726cae9618: Status 404 returned error can't find the container with id 9496860c411282ddd37ae2863c2129ecc836460ab263428b362f8e726cae9618 Nov 25 15:10:34 crc kubenswrapper[4965]: I1125 15:10:34.786373 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bafdade-fd9c-4ada-9263-a30532d21c32" path="/var/lib/kubelet/pods/5bafdade-fd9c-4ada-9263-a30532d21c32/volumes" Nov 25 15:10:35 crc kubenswrapper[4965]: I1125 15:10:35.310447 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" event={"ID":"407934c3-05c4-4108-81cd-1454eb614a7f","Type":"ContainerStarted","Data":"940025fd2e8dccc15a5b8016e7db316a004e2c3d02bcabba79490a2b6bab8227"} Nov 25 15:10:35 crc kubenswrapper[4965]: I1125 15:10:35.310504 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" event={"ID":"407934c3-05c4-4108-81cd-1454eb614a7f","Type":"ContainerStarted","Data":"9496860c411282ddd37ae2863c2129ecc836460ab263428b362f8e726cae9618"} Nov 25 15:10:35 crc kubenswrapper[4965]: I1125 15:10:35.310888 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:35 crc kubenswrapper[4965]: I1125 15:10:35.315902 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" Nov 25 15:10:35 crc kubenswrapper[4965]: I1125 15:10:35.333322 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-99cc7585b-tjrvg" podStartSLOduration=3.333301633 podStartE2EDuration="3.333301633s" podCreationTimestamp="2025-11-25 15:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:10:35.327584275 +0000 UTC m=+380.295178031" watchObservedRunningTime="2025-11-25 15:10:35.333301633 +0000 UTC m=+380.300895389" Nov 25 15:10:37 crc kubenswrapper[4965]: I1125 15:10:37.568066 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" podUID="8c017b51-468b-4ff4-9524-1e4349a54323" containerName="registry" containerID="cri-o://f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb" gracePeriod=30 Nov 25 15:10:37 crc kubenswrapper[4965]: I1125 15:10:37.955390 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112276 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112359 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-trusted-ca\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112389 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdx55\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-kube-api-access-fdx55\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112413 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-bound-sa-token\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112482 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8c017b51-468b-4ff4-9524-1e4349a54323-installation-pull-secrets\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112503 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-registry-tls\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112530 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8c017b51-468b-4ff4-9524-1e4349a54323-ca-trust-extracted\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.112544 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-registry-certificates\") pod \"8c017b51-468b-4ff4-9524-1e4349a54323\" (UID: \"8c017b51-468b-4ff4-9524-1e4349a54323\") " Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.113441 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.113590 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.118448 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c017b51-468b-4ff4-9524-1e4349a54323-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.118520 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.119019 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.127830 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-kube-api-access-fdx55" (OuterVolumeSpecName: "kube-api-access-fdx55") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "kube-api-access-fdx55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.128501 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.136115 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c017b51-468b-4ff4-9524-1e4349a54323-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8c017b51-468b-4ff4-9524-1e4349a54323" (UID: "8c017b51-468b-4ff4-9524-1e4349a54323"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213871 4965 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8c017b51-468b-4ff4-9524-1e4349a54323-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213912 4965 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213932 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c017b51-468b-4ff4-9524-1e4349a54323-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213942 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdx55\" (UniqueName: 
\"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-kube-api-access-fdx55\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213953 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213981 4965 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8c017b51-468b-4ff4-9524-1e4349a54323-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.213992 4965 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8c017b51-468b-4ff4-9524-1e4349a54323-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.330450 4965 generic.go:334] "Generic (PLEG): container finished" podID="8c017b51-468b-4ff4-9524-1e4349a54323" containerID="f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb" exitCode=0 Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.330495 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" event={"ID":"8c017b51-468b-4ff4-9524-1e4349a54323","Type":"ContainerDied","Data":"f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb"} Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.330526 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" event={"ID":"8c017b51-468b-4ff4-9524-1e4349a54323","Type":"ContainerDied","Data":"1d79333568b27b0f13491a19fa49c1abd9ad1ade83c0768b4f3490a657121d31"} Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.330543 4965 scope.go:117] "RemoveContainer" 
containerID="f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.330679 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22g9m" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.346040 4965 scope.go:117] "RemoveContainer" containerID="f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb" Nov 25 15:10:38 crc kubenswrapper[4965]: E1125 15:10:38.346358 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb\": container with ID starting with f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb not found: ID does not exist" containerID="f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.346399 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb"} err="failed to get container status \"f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb\": rpc error: code = NotFound desc = could not find container \"f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb\": container with ID starting with f6dd0db3651e60d74ca4b8bd728acd2ed544b2fee14ac87e3f102ea6ce9738bb not found: ID does not exist" Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.360326 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22g9m"] Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.367251 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22g9m"] Nov 25 15:10:38 crc kubenswrapper[4965]: I1125 15:10:38.777617 4965 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8c017b51-468b-4ff4-9524-1e4349a54323" path="/var/lib/kubelet/pods/8c017b51-468b-4ff4-9524-1e4349a54323/volumes" Nov 25 15:10:39 crc kubenswrapper[4965]: I1125 15:10:39.900691 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6n7bt" Nov 25 15:10:39 crc kubenswrapper[4965]: I1125 15:10:39.901011 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6n7bt" Nov 25 15:10:39 crc kubenswrapper[4965]: I1125 15:10:39.960637 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6n7bt" Nov 25 15:10:40 crc kubenswrapper[4965]: I1125 15:10:40.407752 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6n7bt" Nov 25 15:10:41 crc kubenswrapper[4965]: I1125 15:10:41.299703 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6plmq" Nov 25 15:10:41 crc kubenswrapper[4965]: I1125 15:10:41.300712 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6plmq" Nov 25 15:10:41 crc kubenswrapper[4965]: I1125 15:10:41.349312 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6plmq" Nov 25 15:10:41 crc kubenswrapper[4965]: I1125 15:10:41.392189 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6plmq" Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.260434 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.260757 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.260803 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.261379 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5219e353dd210900b2aa5e32adf4b960a9f3443e7b6f3437667737ce403d5782"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.261438 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://5219e353dd210900b2aa5e32adf4b960a9f3443e7b6f3437667737ce403d5782" gracePeriod=600 Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.421600 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="5219e353dd210900b2aa5e32adf4b960a9f3443e7b6f3437667737ce403d5782" exitCode=0 Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.422306 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" 
event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"5219e353dd210900b2aa5e32adf4b960a9f3443e7b6f3437667737ce403d5782"} Nov 25 15:10:53 crc kubenswrapper[4965]: I1125 15:10:53.422488 4965 scope.go:117] "RemoveContainer" containerID="64bb8c236af9fc691844f7fb38bead1e733b4ac22c7491b041bb0122fcdc545e" Nov 25 15:10:54 crc kubenswrapper[4965]: I1125 15:10:54.430392 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"d2f7e1551d78a273523d393b87983439995fab8ee94e0896b828f265d468b25a"} Nov 25 15:12:53 crc kubenswrapper[4965]: I1125 15:12:53.260898 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:12:53 crc kubenswrapper[4965]: I1125 15:12:53.262125 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:13:17 crc kubenswrapper[4965]: I1125 15:13:17.052749 4965 scope.go:117] "RemoveContainer" containerID="b051a7237f72dc7af7c4e4be4425ef060da5785a432f83cce24891531394a216" Nov 25 15:13:23 crc kubenswrapper[4965]: I1125 15:13:23.260078 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:13:23 crc kubenswrapper[4965]: I1125 
15:13:23.260487 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:13:53 crc kubenswrapper[4965]: I1125 15:13:53.263954 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:13:53 crc kubenswrapper[4965]: I1125 15:13:53.264913 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:13:53 crc kubenswrapper[4965]: I1125 15:13:53.265029 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:13:53 crc kubenswrapper[4965]: I1125 15:13:53.266761 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2f7e1551d78a273523d393b87983439995fab8ee94e0896b828f265d468b25a"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:13:53 crc kubenswrapper[4965]: I1125 15:13:53.266907 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" 
containerName="machine-config-daemon" containerID="cri-o://d2f7e1551d78a273523d393b87983439995fab8ee94e0896b828f265d468b25a" gracePeriod=600 Nov 25 15:13:54 crc kubenswrapper[4965]: I1125 15:13:54.553864 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="d2f7e1551d78a273523d393b87983439995fab8ee94e0896b828f265d468b25a" exitCode=0 Nov 25 15:13:54 crc kubenswrapper[4965]: I1125 15:13:54.554001 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"d2f7e1551d78a273523d393b87983439995fab8ee94e0896b828f265d468b25a"} Nov 25 15:13:54 crc kubenswrapper[4965]: I1125 15:13:54.555282 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"99a3ddb14dfc84a3500a205d74675321fc95f75084490879b200cf9441df58f4"} Nov 25 15:13:54 crc kubenswrapper[4965]: I1125 15:13:54.555339 4965 scope.go:117] "RemoveContainer" containerID="5219e353dd210900b2aa5e32adf4b960a9f3443e7b6f3437667737ce403d5782" Nov 25 15:14:17 crc kubenswrapper[4965]: I1125 15:14:17.090135 4965 scope.go:117] "RemoveContainer" containerID="2f1eeae762f660161d50f777adf757128f6476ed38191443e3367c46659183de" Nov 25 15:14:17 crc kubenswrapper[4965]: I1125 15:14:17.110773 4965 scope.go:117] "RemoveContainer" containerID="f1f9b22925286490d960cc36b65af873cab76e846d20482426e9c7ace788718e" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.185939 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8"] Nov 25 15:15:00 crc kubenswrapper[4965]: E1125 15:15:00.186545 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c017b51-468b-4ff4-9524-1e4349a54323" containerName="registry" Nov 
25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.186557 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c017b51-468b-4ff4-9524-1e4349a54323" containerName="registry" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.186660 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c017b51-468b-4ff4-9524-1e4349a54323" containerName="registry" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.187031 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.189789 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.189882 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.196590 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8"] Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.325024 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92ls\" (UniqueName: \"kubernetes.io/projected/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-kube-api-access-x92ls\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.325128 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-secret-volume\") pod \"collect-profiles-29401395-4krd8\" (UID: 
\"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.325378 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-config-volume\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.426916 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-config-volume\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.427044 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92ls\" (UniqueName: \"kubernetes.io/projected/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-kube-api-access-x92ls\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.427157 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-secret-volume\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.428949 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-config-volume\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.436907 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-secret-volume\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.447249 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92ls\" (UniqueName: \"kubernetes.io/projected/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-kube-api-access-x92ls\") pod \"collect-profiles-29401395-4krd8\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.502705 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.709332 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8"] Nov 25 15:15:00 crc kubenswrapper[4965]: I1125 15:15:00.981266 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" event={"ID":"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3","Type":"ContainerStarted","Data":"0232b7c5764a33925ec995dd95ad07e22b93c36090cc70f837ecd86c5d66d81f"} Nov 25 15:15:01 crc kubenswrapper[4965]: I1125 15:15:01.989957 4965 generic.go:334] "Generic (PLEG): container finished" podID="1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" containerID="fcfdc63def900ff4a4b94f4bb3a89563393c69741e8c0a71874ebdaccdfb0f85" exitCode=0 Nov 25 15:15:01 crc kubenswrapper[4965]: I1125 15:15:01.990129 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" event={"ID":"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3","Type":"ContainerDied","Data":"fcfdc63def900ff4a4b94f4bb3a89563393c69741e8c0a71874ebdaccdfb0f85"} Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.268403 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.361347 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92ls\" (UniqueName: \"kubernetes.io/projected/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-kube-api-access-x92ls\") pod \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.361582 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-config-volume\") pod \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.361620 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-secret-volume\") pod \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\" (UID: \"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3\") " Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.363615 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" (UID: "1c26fd7d-aee2-4ebf-80ed-974ebf0427c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.369022 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" (UID: "1c26fd7d-aee2-4ebf-80ed-974ebf0427c3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.369599 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-kube-api-access-x92ls" (OuterVolumeSpecName: "kube-api-access-x92ls") pod "1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" (UID: "1c26fd7d-aee2-4ebf-80ed-974ebf0427c3"). InnerVolumeSpecName "kube-api-access-x92ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.463141 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92ls\" (UniqueName: \"kubernetes.io/projected/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-kube-api-access-x92ls\") on node \"crc\" DevicePath \"\"" Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.463194 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:15:03 crc kubenswrapper[4965]: I1125 15:15:03.463206 4965 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c26fd7d-aee2-4ebf-80ed-974ebf0427c3-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:15:04 crc kubenswrapper[4965]: I1125 15:15:04.006224 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" event={"ID":"1c26fd7d-aee2-4ebf-80ed-974ebf0427c3","Type":"ContainerDied","Data":"0232b7c5764a33925ec995dd95ad07e22b93c36090cc70f837ecd86c5d66d81f"} Nov 25 15:15:04 crc kubenswrapper[4965]: I1125 15:15:04.006285 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0232b7c5764a33925ec995dd95ad07e22b93c36090cc70f837ecd86c5d66d81f" Nov 25 15:15:04 crc kubenswrapper[4965]: I1125 15:15:04.006346 4965 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401395-4krd8" Nov 25 15:16:23 crc kubenswrapper[4965]: I1125 15:16:23.264154 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:16:23 crc kubenswrapper[4965]: I1125 15:16:23.264876 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.254213 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-fwgnb"] Nov 25 15:16:25 crc kubenswrapper[4965]: E1125 15:16:25.256516 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" containerName="collect-profiles" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.256565 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" containerName="collect-profiles" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.256694 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c26fd7d-aee2-4ebf-80ed-974ebf0427c3" containerName="collect-profiles" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.257297 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.257621 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pwfz\" (UniqueName: \"kubernetes.io/projected/ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275-kube-api-access-2pwfz\") pod \"cert-manager-cainjector-7f985d654d-fwgnb\" (UID: \"ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.259255 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.259387 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.265500 4965 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hjzzj" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.277394 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-fwgnb"] Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.282888 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-f8xjq"] Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.283495 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-f8xjq" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.286128 4965 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-l8q4r" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.299946 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qrbzh"] Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.300714 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.305726 4965 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-89g2n" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.311439 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-f8xjq"] Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.320279 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qrbzh"] Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.358510 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pwfz\" (UniqueName: \"kubernetes.io/projected/ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275-kube-api-access-2pwfz\") pod \"cert-manager-cainjector-7f985d654d-fwgnb\" (UID: \"ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.386744 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pwfz\" (UniqueName: \"kubernetes.io/projected/ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275-kube-api-access-2pwfz\") pod \"cert-manager-cainjector-7f985d654d-fwgnb\" (UID: \"ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" 
Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.460315 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2sw\" (UniqueName: \"kubernetes.io/projected/d8569680-d671-4080-b2ce-ce6e9f858342-kube-api-access-zq2sw\") pod \"cert-manager-5b446d88c5-f8xjq\" (UID: \"d8569680-d671-4080-b2ce-ce6e9f858342\") " pod="cert-manager/cert-manager-5b446d88c5-f8xjq" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.460782 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ktvr\" (UniqueName: \"kubernetes.io/projected/4be52a9a-2bf2-4d74-8c23-ca6825be424a-kube-api-access-8ktvr\") pod \"cert-manager-webhook-5655c58dd6-qrbzh\" (UID: \"4be52a9a-2bf2-4d74-8c23-ca6825be424a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.561884 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2sw\" (UniqueName: \"kubernetes.io/projected/d8569680-d671-4080-b2ce-ce6e9f858342-kube-api-access-zq2sw\") pod \"cert-manager-5b446d88c5-f8xjq\" (UID: \"d8569680-d671-4080-b2ce-ce6e9f858342\") " pod="cert-manager/cert-manager-5b446d88c5-f8xjq" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.561997 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ktvr\" (UniqueName: \"kubernetes.io/projected/4be52a9a-2bf2-4d74-8c23-ca6825be424a-kube-api-access-8ktvr\") pod \"cert-manager-webhook-5655c58dd6-qrbzh\" (UID: \"4be52a9a-2bf2-4d74-8c23-ca6825be424a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.572399 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.586361 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2sw\" (UniqueName: \"kubernetes.io/projected/d8569680-d671-4080-b2ce-ce6e9f858342-kube-api-access-zq2sw\") pod \"cert-manager-5b446d88c5-f8xjq\" (UID: \"d8569680-d671-4080-b2ce-ce6e9f858342\") " pod="cert-manager/cert-manager-5b446d88c5-f8xjq" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.588926 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ktvr\" (UniqueName: \"kubernetes.io/projected/4be52a9a-2bf2-4d74-8c23-ca6825be424a-kube-api-access-8ktvr\") pod \"cert-manager-webhook-5655c58dd6-qrbzh\" (UID: \"4be52a9a-2bf2-4d74-8c23-ca6825be424a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.597513 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-f8xjq" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.612352 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.825960 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-fwgnb"] Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.845071 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.880955 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-f8xjq"] Nov 25 15:16:25 crc kubenswrapper[4965]: W1125 15:16:25.885280 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8569680_d671_4080_b2ce_ce6e9f858342.slice/crio-891004dd932942c8acb92f074c347486b64d71d35341f5c4069ced832a7b6825 WatchSource:0}: Error finding container 891004dd932942c8acb92f074c347486b64d71d35341f5c4069ced832a7b6825: Status 404 returned error can't find the container with id 891004dd932942c8acb92f074c347486b64d71d35341f5c4069ced832a7b6825 Nov 25 15:16:25 crc kubenswrapper[4965]: I1125 15:16:25.908915 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qrbzh"] Nov 25 15:16:25 crc kubenswrapper[4965]: W1125 15:16:25.922541 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4be52a9a_2bf2_4d74_8c23_ca6825be424a.slice/crio-82d3c56f5ead0cb3c5dbdea5a3aac44bd8018ad570dc30050d2ff4aaca4e0b0c WatchSource:0}: Error finding container 82d3c56f5ead0cb3c5dbdea5a3aac44bd8018ad570dc30050d2ff4aaca4e0b0c: Status 404 returned error can't find the container with id 82d3c56f5ead0cb3c5dbdea5a3aac44bd8018ad570dc30050d2ff4aaca4e0b0c Nov 25 15:16:26 crc kubenswrapper[4965]: I1125 15:16:26.519736 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" event={"ID":"ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275","Type":"ContainerStarted","Data":"d95a2f635432ea8ff8d1485d35d8313431122db17284265611929c0c60e15082"} Nov 25 15:16:26 crc kubenswrapper[4965]: I1125 15:16:26.521115 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" event={"ID":"4be52a9a-2bf2-4d74-8c23-ca6825be424a","Type":"ContainerStarted","Data":"82d3c56f5ead0cb3c5dbdea5a3aac44bd8018ad570dc30050d2ff4aaca4e0b0c"} Nov 25 15:16:26 crc kubenswrapper[4965]: I1125 15:16:26.522078 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-f8xjq" event={"ID":"d8569680-d671-4080-b2ce-ce6e9f858342","Type":"ContainerStarted","Data":"891004dd932942c8acb92f074c347486b64d71d35341f5c4069ced832a7b6825"} Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.391277 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-58mtl"] Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.392788 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-controller" containerID="cri-o://0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.393266 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="sbdb" containerID="cri-o://c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.393317 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" 
containerName="nbdb" containerID="cri-o://817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.393359 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="northd" containerID="cri-o://0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.393401 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.393445 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-node" containerID="cri-o://5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.393484 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-acl-logging" containerID="cri-o://103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5" gracePeriod=30 Nov 25 15:16:35 crc kubenswrapper[4965]: I1125 15:16:35.440703 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" containerID="cri-o://edf8a7dc4425022f798e716d332bb0b3e616154dad6e2f975ceed2665e9bcaa0" gracePeriod=30 Nov 25 15:16:36 crc 
kubenswrapper[4965]: I1125 15:16:36.587335 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovnkube-controller/3.log" Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.592230 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-acl-logging/0.log" Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.593446 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-controller/0.log" Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594480 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="edf8a7dc4425022f798e716d332bb0b3e616154dad6e2f975ceed2665e9bcaa0" exitCode=0 Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594514 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9" exitCode=0 Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594526 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11" exitCode=0 Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594538 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5" exitCode=143 Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594547 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201" exitCode=143 Nov 25 15:16:36 crc kubenswrapper[4965]: 
I1125 15:16:36.594540 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"edf8a7dc4425022f798e716d332bb0b3e616154dad6e2f975ceed2665e9bcaa0"} Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594613 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9"} Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594648 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11"} Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594655 4965 scope.go:117] "RemoveContainer" containerID="767a2d204c66b63932d7c53818e4c5edbac88424af74baaa018dae0e4ac285e1" Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594675 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5"} Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.594803 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201"} Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.597387 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/2.log" Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.600558 
4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/1.log" Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.600832 4965 generic.go:334] "Generic (PLEG): container finished" podID="7de2930c-eabd-4919-b214-30b0c83141f7" containerID="c96544dec6c115d2b40555ed7271e0566eeeb05c3c57d0c0534d8bcd6583458f" exitCode=2 Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.600934 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerDied","Data":"c96544dec6c115d2b40555ed7271e0566eeeb05c3c57d0c0534d8bcd6583458f"} Nov 25 15:16:36 crc kubenswrapper[4965]: I1125 15:16:36.601959 4965 scope.go:117] "RemoveContainer" containerID="c96544dec6c115d2b40555ed7271e0566eeeb05c3c57d0c0534d8bcd6583458f" Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.612162 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-acl-logging/0.log" Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.612849 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-controller/0.log" Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.613946 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a" exitCode=0 Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.614012 4965 generic.go:334] "Generic (PLEG): container finished" podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2" exitCode=0 Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.614025 4965 generic.go:334] "Generic (PLEG): container finished" 
podID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerID="0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63" exitCode=0 Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.614067 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a"} Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.614105 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2"} Nov 25 15:16:37 crc kubenswrapper[4965]: I1125 15:16:37.614124 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63"} Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.183142 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2 is running failed: container process not found" containerID="817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.186310 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2 is running failed: container process not found" containerID="817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.186731 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a is running failed: container process not found" containerID="c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.188019 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2 is running failed: container process not found" containerID="817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.188084 4965 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="nbdb" Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.189812 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a is running failed: container process not found" containerID="c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.194077 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a is running failed: container process not found" containerID="c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Nov 25 15:16:39 crc kubenswrapper[4965]: E1125 15:16:39.194297 4965 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="sbdb" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.858624 4965 scope.go:117] "RemoveContainer" containerID="837c67e71328d5e266c11c3d68dbc692d7933b4c05a977346311445482d227ca" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.938465 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-acl-logging/0.log" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.939525 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-controller/0.log" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.940509 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.990896 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-kubelet\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.990955 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-slash\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.990998 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-etc-openvswitch\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991023 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-log-socket\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991050 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-node-log\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991081 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/eea3820a-3f97-48a7-8b49-def506fe71e2-ovn-node-metrics-cert\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991112 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mjnz\" (UniqueName: \"kubernetes.io/projected/eea3820a-3f97-48a7-8b49-def506fe71e2-kube-api-access-9mjnz\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991136 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-ovn\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991159 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-systemd-units\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991186 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-config\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991206 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-env-overrides\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 
15:16:39.991224 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-ovn-kubernetes\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991245 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-systemd\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991268 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-bin\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991292 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-netd\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991313 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-var-lib-openvswitch\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991335 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-netns\") 
pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991356 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-openvswitch\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991381 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.991417 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-script-lib\") pod \"eea3820a-3f97-48a7-8b49-def506fe71e2\" (UID: \"eea3820a-3f97-48a7-8b49-def506fe71e2\") " Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992031 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992077 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992053 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992121 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992145 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992169 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-node-log" (OuterVolumeSpecName: "node-log") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992191 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992213 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-slash" (OuterVolumeSpecName: "host-slash") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992233 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992255 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-log-socket" (OuterVolumeSpecName: "log-socket") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992580 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992621 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992827 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.992865 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.993655 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.994714 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:16:39 crc kubenswrapper[4965]: I1125 15:16:39.997994 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.002673 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea3820a-3f97-48a7-8b49-def506fe71e2-kube-api-access-9mjnz" (OuterVolumeSpecName: "kube-api-access-9mjnz") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "kube-api-access-9mjnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.004747 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea3820a-3f97-48a7-8b49-def506fe71e2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.036230 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "eea3820a-3f97-48a7-8b49-def506fe71e2" (UID: "eea3820a-3f97-48a7-8b49-def506fe71e2"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.036739 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-28sgz"] Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037572 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037599 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037611 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-acl-logging" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037619 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-acl-logging" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037649 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037658 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037674 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037681 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037689 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" 
containerName="kubecfg-setup" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037696 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kubecfg-setup" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037706 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-node" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037712 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-node" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037725 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037732 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037743 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="northd" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037749 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="northd" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037768 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="nbdb" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037776 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="nbdb" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037787 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" 
containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037794 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037802 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="sbdb" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037809 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="sbdb" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.037817 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037824 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037934 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovn-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037944 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="northd" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037953 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.037992 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038019 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" 
containerName="ovn-acl-logging" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038027 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038034 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-node" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038042 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="nbdb" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038073 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038080 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038090 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="sbdb" Nov 25 15:16:40 crc kubenswrapper[4965]: E1125 15:16:40.038241 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038249 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.038337 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" containerName="ovnkube-controller" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.039740 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097381 4965 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097408 4965 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097426 4965 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097441 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097451 4965 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097459 4965 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-slash\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097471 4965 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-etc-openvswitch\") on node \"crc\" DevicePath 
\"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097480 4965 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-log-socket\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097489 4965 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-node-log\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097498 4965 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eea3820a-3f97-48a7-8b49-def506fe71e2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097510 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mjnz\" (UniqueName: \"kubernetes.io/projected/eea3820a-3f97-48a7-8b49-def506fe71e2-kube-api-access-9mjnz\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097518 4965 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097526 4965 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097537 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097545 4965 reconciler_common.go:293] "Volume detached for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eea3820a-3f97-48a7-8b49-def506fe71e2-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097553 4965 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097562 4965 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097573 4965 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097581 4965 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.097590 4965 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eea3820a-3f97-48a7-8b49-def506fe71e2-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198377 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: 
I1125 15:16:40.198433 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-slash\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198450 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-etc-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198466 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-env-overrides\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198580 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-run-ovn-kubernetes\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198613 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-systemd\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc 
kubenswrapper[4965]: I1125 15:16:40.198646 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovnkube-config\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198692 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-cni-netd\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198722 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxl4\" (UniqueName: \"kubernetes.io/projected/f937e187-0b79-400e-8e8f-e47afda9ddf5-kube-api-access-8nxl4\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198743 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-systemd-units\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198790 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-log-socket\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" 
Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198815 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-ovn\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198842 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-var-lib-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198879 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovnkube-script-lib\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198920 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-cni-bin\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198944 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-kubelet\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.198982 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-node-log\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.199005 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovn-node-metrics-cert\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.199026 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.199054 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-run-netns\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.299905 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-ovn\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.299950 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-log-socket\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.299995 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-var-lib-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300028 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovnkube-script-lib\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300053 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-cni-bin\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300071 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-kubelet\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc 
kubenswrapper[4965]: I1125 15:16:40.300088 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-node-log\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300106 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovn-node-metrics-cert\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300127 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300121 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-ovn\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300197 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-run-netns\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300206 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-log-socket\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300237 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-kubelet\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300148 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-run-netns\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300266 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-var-lib-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300280 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300317 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-slash\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300339 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-etc-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300360 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-env-overrides\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300385 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-run-ovn-kubernetes\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300406 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-systemd\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300426 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovnkube-config\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300447 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-cni-netd\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300467 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxl4\" (UniqueName: \"kubernetes.io/projected/f937e187-0b79-400e-8e8f-e47afda9ddf5-kube-api-access-8nxl4\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300485 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-systemd-units\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300494 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-cni-bin\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300539 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-systemd-units\") pod 
\"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300569 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-node-log\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300604 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-run-ovn-kubernetes\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.300626 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-systemd\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301061 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovnkube-script-lib\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301108 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28sgz\" (UID: 
\"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301157 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-env-overrides\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301212 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-slash\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301247 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-etc-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301281 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-host-cni-netd\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.301312 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f937e187-0b79-400e-8e8f-e47afda9ddf5-run-openvswitch\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc 
kubenswrapper[4965]: I1125 15:16:40.301370 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovnkube-config\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.306129 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f937e187-0b79-400e-8e8f-e47afda9ddf5-ovn-node-metrics-cert\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.318441 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxl4\" (UniqueName: \"kubernetes.io/projected/f937e187-0b79-400e-8e8f-e47afda9ddf5-kube-api-access-8nxl4\") pod \"ovnkube-node-28sgz\" (UID: \"f937e187-0b79-400e-8e8f-e47afda9ddf5\") " pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.398826 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:16:40 crc kubenswrapper[4965]: W1125 15:16:40.420322 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf937e187_0b79_400e_8e8f_e47afda9ddf5.slice/crio-95823900022c3b195e2cda2e74db276ca9be7c7b5a42571d48375d516a848cd9 WatchSource:0}: Error finding container 95823900022c3b195e2cda2e74db276ca9be7c7b5a42571d48375d516a848cd9: Status 404 returned error can't find the container with id 95823900022c3b195e2cda2e74db276ca9be7c7b5a42571d48375d516a848cd9 Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.632135 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8jdpp_7de2930c-eabd-4919-b214-30b0c83141f7/kube-multus/2.log" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.632566 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8jdpp" event={"ID":"7de2930c-eabd-4919-b214-30b0c83141f7","Type":"ContainerStarted","Data":"4f12d4ec320eda59a6763579680a4233391a2b89b990cb75a8dae37555ed86d6"} Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.637827 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-acl-logging/0.log" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.638346 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-controller/0.log" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.638820 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.638792 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-58mtl" event={"ID":"eea3820a-3f97-48a7-8b49-def506fe71e2","Type":"ContainerDied","Data":"af25b7e7f5d56b7885f8f972888ff2580308857c14133f3483a1e83e2d2684af"} Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.639085 4965 scope.go:117] "RemoveContainer" containerID="edf8a7dc4425022f798e716d332bb0b3e616154dad6e2f975ceed2665e9bcaa0" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.640647 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"95823900022c3b195e2cda2e74db276ca9be7c7b5a42571d48375d516a848cd9"} Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.665249 4965 scope.go:117] "RemoveContainer" containerID="c0e92f1e9df05a7b4870b4a0eadf32124eaa3b761a9a086f4ea16a5d58375e1a" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.684687 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-58mtl"] Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.687699 4965 scope.go:117] "RemoveContainer" containerID="817cf72ed95607061eb2cfbdb5093252bee189b8c870362c3400d8ae324d9ad2" Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.696200 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-58mtl"] Nov 25 15:16:40 crc kubenswrapper[4965]: I1125 15:16:40.778826 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea3820a-3f97-48a7-8b49-def506fe71e2" path="/var/lib/kubelet/pods/eea3820a-3f97-48a7-8b49-def506fe71e2/volumes" Nov 25 15:16:41 crc kubenswrapper[4965]: I1125 15:16:41.650885 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-acl-logging/0.log" Nov 25 15:16:41 crc kubenswrapper[4965]: I1125 15:16:41.651889 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-controller/0.log" Nov 25 15:16:41 crc kubenswrapper[4965]: I1125 15:16:41.654238 4965 generic.go:334] "Generic (PLEG): container finished" podID="f937e187-0b79-400e-8e8f-e47afda9ddf5" containerID="9f9eee1b764cbabf1c26602aa0de478e6350d96f24249496e8f1211b0e7e8307" exitCode=0 Nov 25 15:16:41 crc kubenswrapper[4965]: I1125 15:16:41.654281 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerDied","Data":"9f9eee1b764cbabf1c26602aa0de478e6350d96f24249496e8f1211b0e7e8307"} Nov 25 15:16:42 crc kubenswrapper[4965]: I1125 15:16:42.710381 4965 scope.go:117] "RemoveContainer" containerID="0a28b8958e7cc587eec365b916d7c911a2cda2c0f6c2b3f3cba1febbc4504a63" Nov 25 15:16:42 crc kubenswrapper[4965]: I1125 15:16:42.728619 4965 scope.go:117] "RemoveContainer" containerID="ce7a175048222396e35f5de027e9cab535134657aefbba3f2ebf3f93ddbad4c9" Nov 25 15:16:42 crc kubenswrapper[4965]: I1125 15:16:42.748686 4965 scope.go:117] "RemoveContainer" containerID="5c39bf225c95460a03452ef91d7bd483bb67f711addbf2d3b8e5116eca562e11" Nov 25 15:16:42 crc kubenswrapper[4965]: I1125 15:16:42.980302 4965 scope.go:117] "RemoveContainer" containerID="103cfb2c01cd8e0d90e2fbcfda3c6faa053d400b5a58cc15fd8be8e7c9fa8bb5" Nov 25 15:16:43 crc kubenswrapper[4965]: I1125 15:16:43.669841 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-58mtl_eea3820a-3f97-48a7-8b49-def506fe71e2/ovn-controller/0.log" Nov 25 15:16:44 crc kubenswrapper[4965]: I1125 15:16:44.141264 4965 scope.go:117] "RemoveContainer" 
containerID="0e02d93b35d010e0322543d54170e57162cb3355def07374ff72c728a2483201" Nov 25 15:16:44 crc kubenswrapper[4965]: I1125 15:16:44.159241 4965 scope.go:117] "RemoveContainer" containerID="6a99e897ebf5908cf0e04725494ff9f42e52048f076547047d56f82cdb20a434" Nov 25 15:16:44 crc kubenswrapper[4965]: I1125 15:16:44.677409 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"6319a6fe579575ffabc888c0ff0743d56b95654391b62f73f02c60ea99f67754"} Nov 25 15:16:45 crc kubenswrapper[4965]: I1125 15:16:45.684218 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"e19514ecb54684155b64fb68037796146eb282319d6cd5610269f77bcafb95ca"} Nov 25 15:16:51 crc kubenswrapper[4965]: I1125 15:16:51.356912 4965 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 15:16:53 crc kubenswrapper[4965]: I1125 15:16:53.260738 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:16:53 crc kubenswrapper[4965]: I1125 15:16:53.261231 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:16:53 crc kubenswrapper[4965]: I1125 15:16:53.741165 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"24abebb52293fe60942bd5e5a8f269f98dd66f6883eaa5a88b738e303c3c333a"} Nov 25 15:16:54 crc kubenswrapper[4965]: I1125 15:16:54.760101 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"b277450adb7c1f8961f1f7ec89af3ad6e42b6c757ef61d5c461f790a448e856a"} Nov 25 15:16:54 crc kubenswrapper[4965]: I1125 15:16:54.762677 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"4123f6a346f59238fe8b1c6504e84c04195a2ca31f3d4ff0e0ed1494d74128e6"} Nov 25 15:16:55 crc kubenswrapper[4965]: I1125 15:16:55.771478 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"517239adef8c1a8018fb1ad9359e3c4aec78db59b290e127756069d73b0c80d4"} Nov 25 15:16:57 crc kubenswrapper[4965]: I1125 15:16:57.829564 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" event={"ID":"ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275","Type":"ContainerStarted","Data":"de3cbb41cde711adcaa4bf75aaedc63865cff936fb790d2f63d84e713571416c"} Nov 25 15:16:57 crc kubenswrapper[4965]: I1125 15:16:57.836580 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"9206b415d149af18da9d874b59f9f10fbfb44386f928874893b35a87780aa5dc"} Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.882619 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" 
event={"ID":"f937e187-0b79-400e-8e8f-e47afda9ddf5","Type":"ContainerStarted","Data":"c1067dd633a198ec98603964d6148661cfc3978148f59da6cc83e99806c3bf4e"} Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.883796 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.883917 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.883989 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.913673 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.919157 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" podStartSLOduration=20.919135776 podStartE2EDuration="20.919135776s" podCreationTimestamp="2025-11-25 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:17:00.915953762 +0000 UTC m=+765.883547508" watchObservedRunningTime="2025-11-25 15:17:00.919135776 +0000 UTC m=+765.886729542" Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.922123 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-fwgnb" podStartSLOduration=4.386068929 podStartE2EDuration="35.922111725s" podCreationTimestamp="2025-11-25 15:16:25 +0000 UTC" firstStartedPulling="2025-11-25 15:16:25.844772051 +0000 UTC m=+730.812365797" lastFinishedPulling="2025-11-25 15:16:57.380814847 +0000 UTC m=+762.348408593" observedRunningTime="2025-11-25 15:16:57.851601055 +0000 UTC 
m=+762.819194821" watchObservedRunningTime="2025-11-25 15:17:00.922111725 +0000 UTC m=+765.889705491" Nov 25 15:17:00 crc kubenswrapper[4965]: I1125 15:17:00.926415 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:17:03 crc kubenswrapper[4965]: I1125 15:17:03.900221 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" event={"ID":"4be52a9a-2bf2-4d74-8c23-ca6825be424a","Type":"ContainerStarted","Data":"e8f3bcf20ef9ce2dd1b03e04284b6838e703edbfb3da7020ff17138a51faaa0c"} Nov 25 15:17:03 crc kubenswrapper[4965]: I1125 15:17:03.900607 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:17:03 crc kubenswrapper[4965]: I1125 15:17:03.902062 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-f8xjq" event={"ID":"d8569680-d671-4080-b2ce-ce6e9f858342","Type":"ContainerStarted","Data":"2ccbceaca4b158e7d17e41b75474274dcd46bdf04669adf9913b6894335e2797"} Nov 25 15:17:03 crc kubenswrapper[4965]: I1125 15:17:03.916501 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" podStartSLOduration=1.463711548 podStartE2EDuration="38.916431113s" podCreationTimestamp="2025-11-25 15:16:25 +0000 UTC" firstStartedPulling="2025-11-25 15:16:25.924635939 +0000 UTC m=+730.892229685" lastFinishedPulling="2025-11-25 15:17:03.377355504 +0000 UTC m=+768.344949250" observedRunningTime="2025-11-25 15:17:03.916239198 +0000 UTC m=+768.883832944" watchObservedRunningTime="2025-11-25 15:17:03.916431113 +0000 UTC m=+768.884024879" Nov 25 15:17:03 crc kubenswrapper[4965]: I1125 15:17:03.934241 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-f8xjq" podStartSLOduration=2.000678254 
podStartE2EDuration="38.934220506s" podCreationTimestamp="2025-11-25 15:16:25 +0000 UTC" firstStartedPulling="2025-11-25 15:16:25.89111638 +0000 UTC m=+730.858710126" lastFinishedPulling="2025-11-25 15:17:02.824658632 +0000 UTC m=+767.792252378" observedRunningTime="2025-11-25 15:17:03.933543017 +0000 UTC m=+768.901136763" watchObservedRunningTime="2025-11-25 15:17:03.934220506 +0000 UTC m=+768.901814252" Nov 25 15:17:10 crc kubenswrapper[4965]: I1125 15:17:10.425337 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28sgz" Nov 25 15:17:10 crc kubenswrapper[4965]: I1125 15:17:10.615134 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-qrbzh" Nov 25 15:17:23 crc kubenswrapper[4965]: I1125 15:17:23.260000 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:17:23 crc kubenswrapper[4965]: I1125 15:17:23.260857 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:17:23 crc kubenswrapper[4965]: I1125 15:17:23.260914 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:17:23 crc kubenswrapper[4965]: I1125 15:17:23.262719 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"99a3ddb14dfc84a3500a205d74675321fc95f75084490879b200cf9441df58f4"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:17:23 crc kubenswrapper[4965]: I1125 15:17:23.262789 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://99a3ddb14dfc84a3500a205d74675321fc95f75084490879b200cf9441df58f4" gracePeriod=600 Nov 25 15:17:24 crc kubenswrapper[4965]: I1125 15:17:24.015672 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="99a3ddb14dfc84a3500a205d74675321fc95f75084490879b200cf9441df58f4" exitCode=0 Nov 25 15:17:24 crc kubenswrapper[4965]: I1125 15:17:24.015738 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"99a3ddb14dfc84a3500a205d74675321fc95f75084490879b200cf9441df58f4"} Nov 25 15:17:24 crc kubenswrapper[4965]: I1125 15:17:24.016302 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"00ca3c30c6342c0ded628729d3f70a02171e1d4a4c62216224c37d3f6ce21240"} Nov 25 15:17:24 crc kubenswrapper[4965]: I1125 15:17:24.016336 4965 scope.go:117] "RemoveContainer" containerID="d2f7e1551d78a273523d393b87983439995fab8ee94e0896b828f265d468b25a" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.726046 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56"] Nov 25 15:17:55 crc 
kubenswrapper[4965]: I1125 15:17:55.727510 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.729036 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.736837 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56"] Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.842967 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wk6g\" (UniqueName: \"kubernetes.io/projected/519a4c2a-d39b-4e97-9634-622a4283f5c9-kube-api-access-8wk6g\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.843065 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.843117 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.944524 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.944818 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wk6g\" (UniqueName: \"kubernetes.io/projected/519a4c2a-d39b-4e97-9634-622a4283f5c9-kube-api-access-8wk6g\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.944935 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.945072 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.945393 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:55 crc kubenswrapper[4965]: I1125 15:17:55.965746 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wk6g\" (UniqueName: \"kubernetes.io/projected/519a4c2a-d39b-4e97-9634-622a4283f5c9-kube-api-access-8wk6g\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:56 crc kubenswrapper[4965]: I1125 15:17:56.042921 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:17:56 crc kubenswrapper[4965]: I1125 15:17:56.291924 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56"] Nov 25 15:17:57 crc kubenswrapper[4965]: I1125 15:17:57.204594 4965 generic.go:334] "Generic (PLEG): container finished" podID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerID="7afc681131d6616915cea0abb0c350ab23dbae7f7b88518c70064efe7cd51d15" exitCode=0 Nov 25 15:17:57 crc kubenswrapper[4965]: I1125 15:17:57.204677 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" event={"ID":"519a4c2a-d39b-4e97-9634-622a4283f5c9","Type":"ContainerDied","Data":"7afc681131d6616915cea0abb0c350ab23dbae7f7b88518c70064efe7cd51d15"} Nov 25 15:17:57 crc kubenswrapper[4965]: I1125 15:17:57.206328 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" event={"ID":"519a4c2a-d39b-4e97-9634-622a4283f5c9","Type":"ContainerStarted","Data":"d66628a7fd1fdc6900f226aa702d1c86028ef82055f3d188d397017e8d7a8861"} Nov 25 15:17:57 crc kubenswrapper[4965]: I1125 15:17:57.974579 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rncpn"] Nov 25 15:17:57 crc kubenswrapper[4965]: I1125 15:17:57.976060 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:57 crc kubenswrapper[4965]: I1125 15:17:57.980876 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rncpn"] Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.170827 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-594nf\" (UniqueName: \"kubernetes.io/projected/d502f196-5cba-446d-9d52-64151f7d33c0-kube-api-access-594nf\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.171163 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-catalog-content\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.171312 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-utilities\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " 
pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.272367 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-utilities\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.272490 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-594nf\" (UniqueName: \"kubernetes.io/projected/d502f196-5cba-446d-9d52-64151f7d33c0-kube-api-access-594nf\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.272529 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-catalog-content\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.273165 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-catalog-content\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.273452 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-utilities\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc 
kubenswrapper[4965]: I1125 15:17:58.293107 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-594nf\" (UniqueName: \"kubernetes.io/projected/d502f196-5cba-446d-9d52-64151f7d33c0-kube-api-access-594nf\") pod \"redhat-operators-rncpn\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.310820 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:17:58 crc kubenswrapper[4965]: I1125 15:17:58.504920 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rncpn"] Nov 25 15:17:59 crc kubenswrapper[4965]: I1125 15:17:59.219742 4965 generic.go:334] "Generic (PLEG): container finished" podID="d502f196-5cba-446d-9d52-64151f7d33c0" containerID="ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385" exitCode=0 Nov 25 15:17:59 crc kubenswrapper[4965]: I1125 15:17:59.220046 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerDied","Data":"ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385"} Nov 25 15:17:59 crc kubenswrapper[4965]: I1125 15:17:59.220073 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerStarted","Data":"9f7f8a13ea1c0e6e258e0db3ecc110a5445fe6795a1fb5873df35c6749c2a5a1"} Nov 25 15:18:05 crc kubenswrapper[4965]: I1125 15:18:05.265264 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerStarted","Data":"28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba"} Nov 25 15:18:05 crc kubenswrapper[4965]: I1125 
15:18:05.268377 4965 generic.go:334] "Generic (PLEG): container finished" podID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerID="f01190a34ef7b3db62e304438ff19ab637ddf1eb63f5867991f9ce9dd8f6cec2" exitCode=0 Nov 25 15:18:05 crc kubenswrapper[4965]: I1125 15:18:05.268482 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" event={"ID":"519a4c2a-d39b-4e97-9634-622a4283f5c9","Type":"ContainerDied","Data":"f01190a34ef7b3db62e304438ff19ab637ddf1eb63f5867991f9ce9dd8f6cec2"} Nov 25 15:18:06 crc kubenswrapper[4965]: I1125 15:18:06.275600 4965 generic.go:334] "Generic (PLEG): container finished" podID="d502f196-5cba-446d-9d52-64151f7d33c0" containerID="28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba" exitCode=0 Nov 25 15:18:06 crc kubenswrapper[4965]: I1125 15:18:06.275667 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerDied","Data":"28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba"} Nov 25 15:18:06 crc kubenswrapper[4965]: I1125 15:18:06.278613 4965 generic.go:334] "Generic (PLEG): container finished" podID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerID="b37fdc54150a44c9161ffd5a06238331c4a7ffdee441333768736bc3f767f28b" exitCode=0 Nov 25 15:18:06 crc kubenswrapper[4965]: I1125 15:18:06.278659 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" event={"ID":"519a4c2a-d39b-4e97-9634-622a4283f5c9","Type":"ContainerDied","Data":"b37fdc54150a44c9161ffd5a06238331c4a7ffdee441333768736bc3f767f28b"} Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.592631 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.774512 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-util\") pod \"519a4c2a-d39b-4e97-9634-622a4283f5c9\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.779136 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-bundle\") pod \"519a4c2a-d39b-4e97-9634-622a4283f5c9\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.779257 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wk6g\" (UniqueName: \"kubernetes.io/projected/519a4c2a-d39b-4e97-9634-622a4283f5c9-kube-api-access-8wk6g\") pod \"519a4c2a-d39b-4e97-9634-622a4283f5c9\" (UID: \"519a4c2a-d39b-4e97-9634-622a4283f5c9\") " Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.781615 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-bundle" (OuterVolumeSpecName: "bundle") pod "519a4c2a-d39b-4e97-9634-622a4283f5c9" (UID: "519a4c2a-d39b-4e97-9634-622a4283f5c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.788768 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519a4c2a-d39b-4e97-9634-622a4283f5c9-kube-api-access-8wk6g" (OuterVolumeSpecName: "kube-api-access-8wk6g") pod "519a4c2a-d39b-4e97-9634-622a4283f5c9" (UID: "519a4c2a-d39b-4e97-9634-622a4283f5c9"). InnerVolumeSpecName "kube-api-access-8wk6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.789161 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-util" (OuterVolumeSpecName: "util") pod "519a4c2a-d39b-4e97-9634-622a4283f5c9" (UID: "519a4c2a-d39b-4e97-9634-622a4283f5c9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.881498 4965 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.881570 4965 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/519a4c2a-d39b-4e97-9634-622a4283f5c9-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:07 crc kubenswrapper[4965]: I1125 15:18:07.881592 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wk6g\" (UniqueName: \"kubernetes.io/projected/519a4c2a-d39b-4e97-9634-622a4283f5c9-kube-api-access-8wk6g\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:08 crc kubenswrapper[4965]: I1125 15:18:08.306536 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" event={"ID":"519a4c2a-d39b-4e97-9634-622a4283f5c9","Type":"ContainerDied","Data":"d66628a7fd1fdc6900f226aa702d1c86028ef82055f3d188d397017e8d7a8861"} Nov 25 15:18:08 crc kubenswrapper[4965]: I1125 15:18:08.306649 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66628a7fd1fdc6900f226aa702d1c86028ef82055f3d188d397017e8d7a8861" Nov 25 15:18:08 crc kubenswrapper[4965]: I1125 15:18:08.306689 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56" Nov 25 15:18:09 crc kubenswrapper[4965]: I1125 15:18:09.313540 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerStarted","Data":"cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b"} Nov 25 15:18:09 crc kubenswrapper[4965]: I1125 15:18:09.331908 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rncpn" podStartSLOduration=3.5708837989999997 podStartE2EDuration="12.331888915s" podCreationTimestamp="2025-11-25 15:17:57 +0000 UTC" firstStartedPulling="2025-11-25 15:17:59.221627189 +0000 UTC m=+824.189220935" lastFinishedPulling="2025-11-25 15:18:07.982632305 +0000 UTC m=+832.950226051" observedRunningTime="2025-11-25 15:18:09.330274991 +0000 UTC m=+834.297868737" watchObservedRunningTime="2025-11-25 15:18:09.331888915 +0000 UTC m=+834.299482661" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.336934 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-mdzg9"] Nov 25 15:18:12 crc kubenswrapper[4965]: E1125 15:18:12.337189 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="util" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.337204 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="util" Nov 25 15:18:12 crc kubenswrapper[4965]: E1125 15:18:12.337216 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="pull" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.337223 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="pull" 
Nov 25 15:18:12 crc kubenswrapper[4965]: E1125 15:18:12.337233 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="extract" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.337240 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="extract" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.337351 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="519a4c2a-d39b-4e97-9634-622a4283f5c9" containerName="extract" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.337783 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.340071 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dc8pt" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.340277 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.340587 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.349704 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-mdzg9"] Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.434282 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sls44\" (UniqueName: \"kubernetes.io/projected/8bfc4cc3-37e8-4034-89fe-922f3d3fd12d-kube-api-access-sls44\") pod \"nmstate-operator-557fdffb88-mdzg9\" (UID: \"8bfc4cc3-37e8-4034-89fe-922f3d3fd12d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.535916 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sls44\" (UniqueName: \"kubernetes.io/projected/8bfc4cc3-37e8-4034-89fe-922f3d3fd12d-kube-api-access-sls44\") pod \"nmstate-operator-557fdffb88-mdzg9\" (UID: \"8bfc4cc3-37e8-4034-89fe-922f3d3fd12d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.554132 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sls44\" (UniqueName: \"kubernetes.io/projected/8bfc4cc3-37e8-4034-89fe-922f3d3fd12d-kube-api-access-sls44\") pod \"nmstate-operator-557fdffb88-mdzg9\" (UID: \"8bfc4cc3-37e8-4034-89fe-922f3d3fd12d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.651058 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" Nov 25 15:18:12 crc kubenswrapper[4965]: I1125 15:18:12.860307 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-mdzg9"] Nov 25 15:18:12 crc kubenswrapper[4965]: W1125 15:18:12.868166 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bfc4cc3_37e8_4034_89fe_922f3d3fd12d.slice/crio-19ab74cbfc2087ef7bbc4524474e9754b704d40d0274a468a4dbf46adfafc759 WatchSource:0}: Error finding container 19ab74cbfc2087ef7bbc4524474e9754b704d40d0274a468a4dbf46adfafc759: Status 404 returned error can't find the container with id 19ab74cbfc2087ef7bbc4524474e9754b704d40d0274a468a4dbf46adfafc759 Nov 25 15:18:13 crc kubenswrapper[4965]: I1125 15:18:13.348951 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" 
event={"ID":"8bfc4cc3-37e8-4034-89fe-922f3d3fd12d","Type":"ContainerStarted","Data":"19ab74cbfc2087ef7bbc4524474e9754b704d40d0274a468a4dbf46adfafc759"} Nov 25 15:18:18 crc kubenswrapper[4965]: I1125 15:18:18.311817 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:18:18 crc kubenswrapper[4965]: I1125 15:18:18.312185 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:18:18 crc kubenswrapper[4965]: I1125 15:18:18.352779 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:18:18 crc kubenswrapper[4965]: I1125 15:18:18.431020 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.482007 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rncpn"] Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.482504 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rncpn" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="registry-server" containerID="cri-o://cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b" gracePeriod=2 Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.831125 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.856527 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-utilities" (OuterVolumeSpecName: "utilities") pod "d502f196-5cba-446d-9d52-64151f7d33c0" (UID: "d502f196-5cba-446d-9d52-64151f7d33c0"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.857440 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-utilities\") pod \"d502f196-5cba-446d-9d52-64151f7d33c0\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.857562 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-594nf\" (UniqueName: \"kubernetes.io/projected/d502f196-5cba-446d-9d52-64151f7d33c0-kube-api-access-594nf\") pod \"d502f196-5cba-446d-9d52-64151f7d33c0\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.857654 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-catalog-content\") pod \"d502f196-5cba-446d-9d52-64151f7d33c0\" (UID: \"d502f196-5cba-446d-9d52-64151f7d33c0\") " Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.859721 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.865027 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d502f196-5cba-446d-9d52-64151f7d33c0-kube-api-access-594nf" (OuterVolumeSpecName: "kube-api-access-594nf") pod "d502f196-5cba-446d-9d52-64151f7d33c0" (UID: "d502f196-5cba-446d-9d52-64151f7d33c0"). InnerVolumeSpecName "kube-api-access-594nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.960350 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d502f196-5cba-446d-9d52-64151f7d33c0" (UID: "d502f196-5cba-446d-9d52-64151f7d33c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.960760 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502f196-5cba-446d-9d52-64151f7d33c0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:20 crc kubenswrapper[4965]: I1125 15:18:20.960781 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-594nf\" (UniqueName: \"kubernetes.io/projected/d502f196-5cba-446d-9d52-64151f7d33c0-kube-api-access-594nf\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.408828 4965 generic.go:334] "Generic (PLEG): container finished" podID="d502f196-5cba-446d-9d52-64151f7d33c0" containerID="cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b" exitCode=0 Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.408934 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerDied","Data":"cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b"} Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.409007 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rncpn" event={"ID":"d502f196-5cba-446d-9d52-64151f7d33c0","Type":"ContainerDied","Data":"9f7f8a13ea1c0e6e258e0db3ecc110a5445fe6795a1fb5873df35c6749c2a5a1"} Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 
15:18:21.409031 4965 scope.go:117] "RemoveContainer" containerID="cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.409165 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rncpn" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.414144 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" event={"ID":"8bfc4cc3-37e8-4034-89fe-922f3d3fd12d","Type":"ContainerStarted","Data":"a92efe5e926279fd6f8629757e3f160e58587e28ec0d0af46a7034219a5f515a"} Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.430775 4965 scope.go:117] "RemoveContainer" containerID="28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.466235 4965 scope.go:117] "RemoveContainer" containerID="ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.496661 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-mdzg9" podStartSLOduration=2.226622647 podStartE2EDuration="9.496641392s" podCreationTimestamp="2025-11-25 15:18:12 +0000 UTC" firstStartedPulling="2025-11-25 15:18:12.870915557 +0000 UTC m=+837.838509303" lastFinishedPulling="2025-11-25 15:18:20.140934292 +0000 UTC m=+845.108528048" observedRunningTime="2025-11-25 15:18:21.452608915 +0000 UTC m=+846.420202661" watchObservedRunningTime="2025-11-25 15:18:21.496641392 +0000 UTC m=+846.464235138" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.498490 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rncpn"] Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.505070 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rncpn"] Nov 25 
15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.505369 4965 scope.go:117] "RemoveContainer" containerID="cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b" Nov 25 15:18:21 crc kubenswrapper[4965]: E1125 15:18:21.506125 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b\": container with ID starting with cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b not found: ID does not exist" containerID="cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.506168 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b"} err="failed to get container status \"cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b\": rpc error: code = NotFound desc = could not find container \"cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b\": container with ID starting with cbccee3f98b6b4a4ac692fca27fe505ea9656e14f7c4ec238b498e839605e53b not found: ID does not exist" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.506197 4965 scope.go:117] "RemoveContainer" containerID="28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba" Nov 25 15:18:21 crc kubenswrapper[4965]: E1125 15:18:21.506493 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba\": container with ID starting with 28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba not found: ID does not exist" containerID="28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.506527 4965 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba"} err="failed to get container status \"28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba\": rpc error: code = NotFound desc = could not find container \"28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba\": container with ID starting with 28e41518e0576fb356094da7d80aa926e7e8072d5a7946a44450afa59c09f2ba not found: ID does not exist" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.506541 4965 scope.go:117] "RemoveContainer" containerID="ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385" Nov 25 15:18:21 crc kubenswrapper[4965]: E1125 15:18:21.506707 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385\": container with ID starting with ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385 not found: ID does not exist" containerID="ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385" Nov 25 15:18:21 crc kubenswrapper[4965]: I1125 15:18:21.506727 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385"} err="failed to get container status \"ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385\": rpc error: code = NotFound desc = could not find container \"ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385\": container with ID starting with ec4d19e51cb33a4f78aff883ee32d09322cacba2827bfea1deb667bdac79f385 not found: ID does not exist" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.393923 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f"] Nov 25 15:18:22 crc kubenswrapper[4965]: E1125 15:18:22.394159 4965 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="extract-utilities" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.394172 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="extract-utilities" Nov 25 15:18:22 crc kubenswrapper[4965]: E1125 15:18:22.394184 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="registry-server" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.394190 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="registry-server" Nov 25 15:18:22 crc kubenswrapper[4965]: E1125 15:18:22.394203 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="extract-content" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.394209 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="extract-content" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.394295 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" containerName="registry-server" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.394804 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.396788 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wdp4b" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.414393 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.446610 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.447602 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.452789 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.473326 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.478757 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c59704bb-64d1-4282-ad60-648655cf9bf3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.478854 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6v5j\" (UniqueName: \"kubernetes.io/projected/c72314c6-eff3-4dd6-aa24-b6831b35580f-kube-api-access-f6v5j\") pod \"nmstate-metrics-5dcf9c57c5-9c85f\" (UID: \"c72314c6-eff3-4dd6-aa24-b6831b35580f\") " 
pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.478891 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbgpb\" (UniqueName: \"kubernetes.io/projected/c59704bb-64d1-4282-ad60-648655cf9bf3-kube-api-access-qbgpb\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.483684 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-z4cdf"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.486198 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.565459 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.566233 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.569014 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-k2lgw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.569176 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.569297 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579342 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-ovs-socket\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579418 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfrmx\" (UniqueName: \"kubernetes.io/projected/04e7005e-527f-4328-8981-7614b841a91b-kube-api-access-nfrmx\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579457 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6v5j\" (UniqueName: \"kubernetes.io/projected/c72314c6-eff3-4dd6-aa24-b6831b35580f-kube-api-access-f6v5j\") pod \"nmstate-metrics-5dcf9c57c5-9c85f\" (UID: \"c72314c6-eff3-4dd6-aa24-b6831b35580f\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579489 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-dbus-socket\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579510 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04e7005e-527f-4328-8981-7614b841a91b-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579530 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-nmstate-lock\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579563 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbgpb\" (UniqueName: \"kubernetes.io/projected/c59704bb-64d1-4282-ad60-648655cf9bf3-kube-api-access-qbgpb\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579584 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04e7005e-527f-4328-8981-7614b841a91b-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 
25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579612 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dx5f\" (UniqueName: \"kubernetes.io/projected/afba2902-5375-4150-a501-282d517200e3-kube-api-access-7dx5f\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.579636 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c59704bb-64d1-4282-ad60-648655cf9bf3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:22 crc kubenswrapper[4965]: E1125 15:18:22.579755 4965 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 25 15:18:22 crc kubenswrapper[4965]: E1125 15:18:22.579813 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c59704bb-64d1-4282-ad60-648655cf9bf3-tls-key-pair podName:c59704bb-64d1-4282-ad60-648655cf9bf3 nodeName:}" failed. No retries permitted until 2025-11-25 15:18:23.07979339 +0000 UTC m=+848.047387136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c59704bb-64d1-4282-ad60-648655cf9bf3-tls-key-pair") pod "nmstate-webhook-6b89b748d8-7sxnc" (UID: "c59704bb-64d1-4282-ad60-648655cf9bf3") : secret "openshift-nmstate-webhook" not found Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.583330 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.603519 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6v5j\" (UniqueName: \"kubernetes.io/projected/c72314c6-eff3-4dd6-aa24-b6831b35580f-kube-api-access-f6v5j\") pod \"nmstate-metrics-5dcf9c57c5-9c85f\" (UID: \"c72314c6-eff3-4dd6-aa24-b6831b35580f\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.614932 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbgpb\" (UniqueName: \"kubernetes.io/projected/c59704bb-64d1-4282-ad60-648655cf9bf3-kube-api-access-qbgpb\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681089 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-ovs-socket\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681192 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-ovs-socket\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " 
pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681201 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfrmx\" (UniqueName: \"kubernetes.io/projected/04e7005e-527f-4328-8981-7614b841a91b-kube-api-access-nfrmx\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681587 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-dbus-socket\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681857 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-dbus-socket\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681621 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04e7005e-527f-4328-8981-7614b841a91b-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.681932 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-nmstate-lock\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " 
pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.682867 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04e7005e-527f-4328-8981-7614b841a91b-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.683360 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dx5f\" (UniqueName: \"kubernetes.io/projected/afba2902-5375-4150-a501-282d517200e3-kube-api-access-7dx5f\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.682072 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/afba2902-5375-4150-a501-282d517200e3-nmstate-lock\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.682803 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04e7005e-527f-4328-8981-7614b841a91b-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.688314 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04e7005e-527f-4328-8981-7614b841a91b-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " 
pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.708075 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.711589 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfrmx\" (UniqueName: \"kubernetes.io/projected/04e7005e-527f-4328-8981-7614b841a91b-kube-api-access-nfrmx\") pod \"nmstate-console-plugin-5874bd7bc5-hn6qn\" (UID: \"04e7005e-527f-4328-8981-7614b841a91b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.715097 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dx5f\" (UniqueName: \"kubernetes.io/projected/afba2902-5375-4150-a501-282d517200e3-kube-api-access-7dx5f\") pod \"nmstate-handler-z4cdf\" (UID: \"afba2902-5375-4150-a501-282d517200e3\") " pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.780272 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d502f196-5cba-446d-9d52-64151f7d33c0" path="/var/lib/kubelet/pods/d502f196-5cba-446d-9d52-64151f7d33c0/volumes" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.810740 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.824223 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7577cbd5f4-9lslw"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.824849 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.847908 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7577cbd5f4-9lslw"] Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885329 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-config\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885390 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-serving-cert\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885412 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-service-ca\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885426 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-trusted-ca-bundle\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885467 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-oauth-config\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885486 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fhr\" (UniqueName: \"kubernetes.io/projected/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-kube-api-access-h4fhr\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.885504 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-oauth-serving-cert\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.891914 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986009 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-config\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986068 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-serving-cert\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986091 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-service-ca\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986106 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-trusted-ca-bundle\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986131 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-oauth-config\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") 
" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986153 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fhr\" (UniqueName: \"kubernetes.io/projected/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-kube-api-access-h4fhr\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.986173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-oauth-serving-cert\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.987073 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-oauth-serving-cert\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.988952 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-trusted-ca-bundle\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.990711 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-config\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " 
pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.990791 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-service-ca\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.993112 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-serving-cert\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:22 crc kubenswrapper[4965]: I1125 15:18:22.995555 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-console-oauth-config\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.003736 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fhr\" (UniqueName: \"kubernetes.io/projected/b4ce97d9-0e42-4e9c-be90-51287d0a07e3-kube-api-access-h4fhr\") pod \"console-7577cbd5f4-9lslw\" (UID: \"b4ce97d9-0e42-4e9c-be90-51287d0a07e3\") " pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.087248 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c59704bb-64d1-4282-ad60-648655cf9bf3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 
25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.093777 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c59704bb-64d1-4282-ad60-648655cf9bf3-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7sxnc\" (UID: \"c59704bb-64d1-4282-ad60-648655cf9bf3\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.094672 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.097444 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn"] Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.128467 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f"] Nov 25 15:18:23 crc kubenswrapper[4965]: W1125 15:18:23.133586 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72314c6_eff3_4dd6_aa24_b6831b35580f.slice/crio-abe036a2a809375a491163d2ea3639c96fcf0d57afc3cdf968ce40cecae741af WatchSource:0}: Error finding container abe036a2a809375a491163d2ea3639c96fcf0d57afc3cdf968ce40cecae741af: Status 404 returned error can't find the container with id abe036a2a809375a491163d2ea3639c96fcf0d57afc3cdf968ce40cecae741af Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.147824 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.309045 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc"] Nov 25 15:18:23 crc kubenswrapper[4965]: W1125 15:18:23.311033 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59704bb_64d1_4282_ad60_648655cf9bf3.slice/crio-336ffb5a499cf9315599de6bfe9a25120da8f80d8c17b1150ec39443f091725c WatchSource:0}: Error finding container 336ffb5a499cf9315599de6bfe9a25120da8f80d8c17b1150ec39443f091725c: Status 404 returned error can't find the container with id 336ffb5a499cf9315599de6bfe9a25120da8f80d8c17b1150ec39443f091725c Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.455923 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" event={"ID":"c72314c6-eff3-4dd6-aa24-b6831b35580f","Type":"ContainerStarted","Data":"abe036a2a809375a491163d2ea3639c96fcf0d57afc3cdf968ce40cecae741af"} Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.457586 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4cdf" event={"ID":"afba2902-5375-4150-a501-282d517200e3","Type":"ContainerStarted","Data":"e041d7238f4e8c53d13de3879f4c1ccaddb55a70ed0e6b0adfcd33689521f008"} Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.458875 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" event={"ID":"c59704bb-64d1-4282-ad60-648655cf9bf3","Type":"ContainerStarted","Data":"336ffb5a499cf9315599de6bfe9a25120da8f80d8c17b1150ec39443f091725c"} Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.460451 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" 
event={"ID":"04e7005e-527f-4328-8981-7614b841a91b","Type":"ContainerStarted","Data":"2489b0b05a4a23e13757de3dabeca33e93eecdde55fb7950a2820db66565e9ef"} Nov 25 15:18:23 crc kubenswrapper[4965]: I1125 15:18:23.627498 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7577cbd5f4-9lslw"] Nov 25 15:18:24 crc kubenswrapper[4965]: I1125 15:18:24.468217 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7577cbd5f4-9lslw" event={"ID":"b4ce97d9-0e42-4e9c-be90-51287d0a07e3","Type":"ContainerStarted","Data":"cda72ef96ab0f1d2ba68d454ebed34aad726f16f17f340f5afb30f16b1323416"} Nov 25 15:18:24 crc kubenswrapper[4965]: I1125 15:18:24.468567 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7577cbd5f4-9lslw" event={"ID":"b4ce97d9-0e42-4e9c-be90-51287d0a07e3","Type":"ContainerStarted","Data":"d9e71751e47ceac12a639b11bfb3561745022f846f4308ac4b3540a5be7de9ec"} Nov 25 15:18:24 crc kubenswrapper[4965]: I1125 15:18:24.487481 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7577cbd5f4-9lslw" podStartSLOduration=2.487464112 podStartE2EDuration="2.487464112s" podCreationTimestamp="2025-11-25 15:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:18:24.485545741 +0000 UTC m=+849.453139497" watchObservedRunningTime="2025-11-25 15:18:24.487464112 +0000 UTC m=+849.455057858" Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.488268 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" event={"ID":"04e7005e-527f-4328-8981-7614b841a91b","Type":"ContainerStarted","Data":"0cac88942fb242092d982f3b720a48697c900c339563f9f4d6ca97f2310e0c79"} Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.490655 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" event={"ID":"c72314c6-eff3-4dd6-aa24-b6831b35580f","Type":"ContainerStarted","Data":"523f3aaebd0fcd94b9cbc0ee42232a383d0ce73980421035612676380d1ed4e7"} Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.495982 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4cdf" event={"ID":"afba2902-5375-4150-a501-282d517200e3","Type":"ContainerStarted","Data":"8c45b3ea06700b5c45877f6a2eae9206d3f18f7e28a65d959bdc1c0c701e4b1f"} Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.496091 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.497640 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" event={"ID":"c59704bb-64d1-4282-ad60-648655cf9bf3","Type":"ContainerStarted","Data":"ca48ac5d45427b71deed8c634f98d26adf029bc628ff841a38a596c221b8110d"} Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.497837 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.510950 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-hn6qn" podStartSLOduration=2.308774784 podStartE2EDuration="5.510924912s" podCreationTimestamp="2025-11-25 15:18:22 +0000 UTC" firstStartedPulling="2025-11-25 15:18:23.105553299 +0000 UTC m=+848.073147065" lastFinishedPulling="2025-11-25 15:18:26.307703447 +0000 UTC m=+851.275297193" observedRunningTime="2025-11-25 15:18:27.509810591 +0000 UTC m=+852.477404367" watchObservedRunningTime="2025-11-25 15:18:27.510924912 +0000 UTC m=+852.478518698" Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.537778 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-z4cdf" podStartSLOduration=2.108925908 podStartE2EDuration="5.537756711s" podCreationTimestamp="2025-11-25 15:18:22 +0000 UTC" firstStartedPulling="2025-11-25 15:18:22.87652721 +0000 UTC m=+847.844120956" lastFinishedPulling="2025-11-25 15:18:26.305358013 +0000 UTC m=+851.272951759" observedRunningTime="2025-11-25 15:18:27.533273249 +0000 UTC m=+852.500866995" watchObservedRunningTime="2025-11-25 15:18:27.537756711 +0000 UTC m=+852.505350457" Nov 25 15:18:27 crc kubenswrapper[4965]: I1125 15:18:27.560017 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" podStartSLOduration=2.54032128 podStartE2EDuration="5.559997476s" podCreationTimestamp="2025-11-25 15:18:22 +0000 UTC" firstStartedPulling="2025-11-25 15:18:23.312054275 +0000 UTC m=+848.279648021" lastFinishedPulling="2025-11-25 15:18:26.331730471 +0000 UTC m=+851.299324217" observedRunningTime="2025-11-25 15:18:27.55720677 +0000 UTC m=+852.524800526" watchObservedRunningTime="2025-11-25 15:18:27.559997476 +0000 UTC m=+852.527591242" Nov 25 15:18:30 crc kubenswrapper[4965]: I1125 15:18:29.984624 4965 patch_prober.go:28] interesting pod/router-default-5444994796-82czk container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 15:18:30 crc kubenswrapper[4965]: I1125 15:18:29.985036 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-82czk" podUID="09c34009-3606-4b93-9f5f-c8a478aee354" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:18:31 crc kubenswrapper[4965]: I1125 15:18:31.068763 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" event={"ID":"c72314c6-eff3-4dd6-aa24-b6831b35580f","Type":"ContainerStarted","Data":"b161b9e8be1792e48e93c9e12d0a99827a4ba142bcc185356e47efbef6e610c3"} Nov 25 15:18:31 crc kubenswrapper[4965]: I1125 15:18:31.102362 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-9c85f" podStartSLOduration=3.760757301 podStartE2EDuration="9.102342166s" podCreationTimestamp="2025-11-25 15:18:22 +0000 UTC" firstStartedPulling="2025-11-25 15:18:23.135937205 +0000 UTC m=+848.103530971" lastFinishedPulling="2025-11-25 15:18:28.47752209 +0000 UTC m=+853.445115836" observedRunningTime="2025-11-25 15:18:31.097488234 +0000 UTC m=+856.065082010" watchObservedRunningTime="2025-11-25 15:18:31.102342166 +0000 UTC m=+856.069935922" Nov 25 15:18:32 crc kubenswrapper[4965]: I1125 15:18:32.847778 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-z4cdf" Nov 25 15:18:33 crc kubenswrapper[4965]: I1125 15:18:33.148192 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:33 crc kubenswrapper[4965]: I1125 15:18:33.148245 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:33 crc kubenswrapper[4965]: I1125 15:18:33.155561 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:34 crc kubenswrapper[4965]: I1125 15:18:34.101798 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7577cbd5f4-9lslw" Nov 25 15:18:34 crc kubenswrapper[4965]: I1125 15:18:34.182517 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pdprk"] Nov 25 15:18:43 crc kubenswrapper[4965]: I1125 15:18:43.100880 
4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7sxnc" Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.799777 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k"] Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.801628 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.803889 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.813404 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k"] Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.977472 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.977826 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:58 crc kubenswrapper[4965]: I1125 15:18:58.977873 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28dq9\" (UniqueName: \"kubernetes.io/projected/464b1b3b-03a0-41e6-842a-446cac908eea-kube-api-access-28dq9\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.078800 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.078846 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.078870 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28dq9\" (UniqueName: \"kubernetes.io/projected/464b1b3b-03a0-41e6-842a-446cac908eea-kube-api-access-28dq9\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.079359 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.079369 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.096772 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28dq9\" (UniqueName: \"kubernetes.io/projected/464b1b3b-03a0-41e6-842a-446cac908eea-kube-api-access-28dq9\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.147471 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.235179 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pdprk" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" containerName="console" containerID="cri-o://8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067" gracePeriod=15 Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.561030 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k"] Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.659917 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pdprk_370d7098-0a4b-4aa0-8fb2-da7823f4d2d0/console/0.log" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.660018 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689352 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-service-ca\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689433 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-serving-cert\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689507 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-trusted-ca-bundle\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689541 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfllt\" (UniqueName: \"kubernetes.io/projected/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-kube-api-access-rfllt\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689568 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-config\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689655 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-oauth-config\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.689683 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-oauth-serving-cert\") pod \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\" (UID: \"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0\") " Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.690995 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.691067 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.691342 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-config" (OuterVolumeSpecName: "console-config") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.691785 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.698576 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.699916 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-kube-api-access-rfllt" (OuterVolumeSpecName: "kube-api-access-rfllt") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "kube-api-access-rfllt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.700408 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" (UID: "370d7098-0a4b-4aa0-8fb2-da7823f4d2d0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790707 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfllt\" (UniqueName: \"kubernetes.io/projected/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-kube-api-access-rfllt\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790739 4965 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790747 4965 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790755 4965 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790765 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790777 4965 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:18:59 crc kubenswrapper[4965]: I1125 15:18:59.790787 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:00 crc 
kubenswrapper[4965]: I1125 15:19:00.267444 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pdprk_370d7098-0a4b-4aa0-8fb2-da7823f4d2d0/console/0.log" Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.267497 4965 generic.go:334] "Generic (PLEG): container finished" podID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" containerID="8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067" exitCode=2 Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.267592 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pdprk" Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.270147 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdprk" event={"ID":"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0","Type":"ContainerDied","Data":"8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067"} Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.270209 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pdprk" event={"ID":"370d7098-0a4b-4aa0-8fb2-da7823f4d2d0","Type":"ContainerDied","Data":"865c8ba5942ac9f1ed321c656b1ec762a15c73178aad0fc7da19476289a38a21"} Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.270256 4965 scope.go:117] "RemoveContainer" containerID="8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067" Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.274323 4965 generic.go:334] "Generic (PLEG): container finished" podID="464b1b3b-03a0-41e6-842a-446cac908eea" containerID="63cf8d0d6848f9b08e3410be12d04c6aa11d75f654c9c0985688e0a7c251f58c" exitCode=0 Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.274364 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" 
event={"ID":"464b1b3b-03a0-41e6-842a-446cac908eea","Type":"ContainerDied","Data":"63cf8d0d6848f9b08e3410be12d04c6aa11d75f654c9c0985688e0a7c251f58c"} Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.274392 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" event={"ID":"464b1b3b-03a0-41e6-842a-446cac908eea","Type":"ContainerStarted","Data":"8df24488c13708abaa05fad812a9ff7ae540aa31552187bc3d18f7c44303b3d3"} Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.288214 4965 scope.go:117] "RemoveContainer" containerID="8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067" Nov 25 15:19:00 crc kubenswrapper[4965]: E1125 15:19:00.288605 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067\": container with ID starting with 8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067 not found: ID does not exist" containerID="8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067" Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.288659 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067"} err="failed to get container status \"8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067\": rpc error: code = NotFound desc = could not find container \"8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067\": container with ID starting with 8215be3048c09e7fdf67ef175c5bf5939b285458cebf1e5a48b114ec5f1a1067 not found: ID does not exist" Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.315808 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pdprk"] Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.320188 4965 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pdprk"] Nov 25 15:19:00 crc kubenswrapper[4965]: I1125 15:19:00.791481 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" path="/var/lib/kubelet/pods/370d7098-0a4b-4aa0-8fb2-da7823f4d2d0/volumes" Nov 25 15:19:05 crc kubenswrapper[4965]: I1125 15:19:05.306098 4965 generic.go:334] "Generic (PLEG): container finished" podID="464b1b3b-03a0-41e6-842a-446cac908eea" containerID="268b84eba5d19e53d77157fe2c32ef9edc8ad42f0f545d90f4a4e75d0c1b4090" exitCode=0 Nov 25 15:19:05 crc kubenswrapper[4965]: I1125 15:19:05.306159 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" event={"ID":"464b1b3b-03a0-41e6-842a-446cac908eea","Type":"ContainerDied","Data":"268b84eba5d19e53d77157fe2c32ef9edc8ad42f0f545d90f4a4e75d0c1b4090"} Nov 25 15:19:06 crc kubenswrapper[4965]: I1125 15:19:06.313379 4965 generic.go:334] "Generic (PLEG): container finished" podID="464b1b3b-03a0-41e6-842a-446cac908eea" containerID="bbda01913312996a105017e504e54a16eec60dd98bf3762429da0456fbbd2f16" exitCode=0 Nov 25 15:19:06 crc kubenswrapper[4965]: I1125 15:19:06.313903 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" event={"ID":"464b1b3b-03a0-41e6-842a-446cac908eea","Type":"ContainerDied","Data":"bbda01913312996a105017e504e54a16eec60dd98bf3762429da0456fbbd2f16"} Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.537377 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.583460 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-util\") pod \"464b1b3b-03a0-41e6-842a-446cac908eea\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.583516 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-bundle\") pod \"464b1b3b-03a0-41e6-842a-446cac908eea\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.583579 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28dq9\" (UniqueName: \"kubernetes.io/projected/464b1b3b-03a0-41e6-842a-446cac908eea-kube-api-access-28dq9\") pod \"464b1b3b-03a0-41e6-842a-446cac908eea\" (UID: \"464b1b3b-03a0-41e6-842a-446cac908eea\") " Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.584722 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-bundle" (OuterVolumeSpecName: "bundle") pod "464b1b3b-03a0-41e6-842a-446cac908eea" (UID: "464b1b3b-03a0-41e6-842a-446cac908eea"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.584912 4965 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.595312 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/464b1b3b-03a0-41e6-842a-446cac908eea-kube-api-access-28dq9" (OuterVolumeSpecName: "kube-api-access-28dq9") pod "464b1b3b-03a0-41e6-842a-446cac908eea" (UID: "464b1b3b-03a0-41e6-842a-446cac908eea"). InnerVolumeSpecName "kube-api-access-28dq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.595917 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-util" (OuterVolumeSpecName: "util") pod "464b1b3b-03a0-41e6-842a-446cac908eea" (UID: "464b1b3b-03a0-41e6-842a-446cac908eea"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.685917 4965 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/464b1b3b-03a0-41e6-842a-446cac908eea-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:07 crc kubenswrapper[4965]: I1125 15:19:07.685946 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28dq9\" (UniqueName: \"kubernetes.io/projected/464b1b3b-03a0-41e6-842a-446cac908eea-kube-api-access-28dq9\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:08 crc kubenswrapper[4965]: I1125 15:19:08.330231 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" event={"ID":"464b1b3b-03a0-41e6-842a-446cac908eea","Type":"ContainerDied","Data":"8df24488c13708abaa05fad812a9ff7ae540aa31552187bc3d18f7c44303b3d3"} Nov 25 15:19:08 crc kubenswrapper[4965]: I1125 15:19:08.330294 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df24488c13708abaa05fad812a9ff7ae540aa31552187bc3d18f7c44303b3d3" Nov 25 15:19:08 crc kubenswrapper[4965]: I1125 15:19:08.330328 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.909509 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9"] Nov 25 15:19:16 crc kubenswrapper[4965]: E1125 15:19:16.910363 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="pull" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.910379 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="pull" Nov 25 15:19:16 crc kubenswrapper[4965]: E1125 15:19:16.910391 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="extract" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.910399 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="extract" Nov 25 15:19:16 crc kubenswrapper[4965]: E1125 15:19:16.910412 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" containerName="console" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.910419 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" containerName="console" Nov 25 15:19:16 crc kubenswrapper[4965]: E1125 15:19:16.910431 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="util" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.910438 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="util" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.910560 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="370d7098-0a4b-4aa0-8fb2-da7823f4d2d0" 
containerName="console" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.910578 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="464b1b3b-03a0-41e6-842a-446cac908eea" containerName="extract" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.911030 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.914488 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.914831 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.915017 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.915177 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.915515 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vdkx5" Nov 25 15:19:16 crc kubenswrapper[4965]: I1125 15:19:16.943672 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9"] Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.098642 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-webhook-cert\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 
15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.098881 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-apiservice-cert\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.099032 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkq79\" (UniqueName: \"kubernetes.io/projected/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-kube-api-access-wkq79\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.199747 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-webhook-cert\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.199811 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-apiservice-cert\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.199846 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkq79\" (UniqueName: 
\"kubernetes.io/projected/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-kube-api-access-wkq79\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.209773 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-apiservice-cert\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.220099 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-webhook-cert\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.243023 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkq79\" (UniqueName: \"kubernetes.io/projected/7f37b40d-6b30-4f51-a5d0-4767877a0fa3-kube-api-access-wkq79\") pod \"metallb-operator-controller-manager-6c8f57bc76-pdgb9\" (UID: \"7f37b40d-6b30-4f51-a5d0-4767877a0fa3\") " pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.256782 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm"] Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.262914 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.271830 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.272149 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.277752 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm"] Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.277897 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mpbxz" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.302934 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcszg\" (UniqueName: \"kubernetes.io/projected/a611d5d0-f5a3-4e37-baeb-73104c16018a-kube-api-access-wcszg\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.303340 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a611d5d0-f5a3-4e37-baeb-73104c16018a-apiservice-cert\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.303369 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/a611d5d0-f5a3-4e37-baeb-73104c16018a-webhook-cert\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.404058 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcszg\" (UniqueName: \"kubernetes.io/projected/a611d5d0-f5a3-4e37-baeb-73104c16018a-kube-api-access-wcszg\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.404535 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a611d5d0-f5a3-4e37-baeb-73104c16018a-apiservice-cert\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.404763 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a611d5d0-f5a3-4e37-baeb-73104c16018a-webhook-cert\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.408006 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a611d5d0-f5a3-4e37-baeb-73104c16018a-webhook-cert\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc 
kubenswrapper[4965]: I1125 15:19:17.408388 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a611d5d0-f5a3-4e37-baeb-73104c16018a-apiservice-cert\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.433668 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcszg\" (UniqueName: \"kubernetes.io/projected/a611d5d0-f5a3-4e37-baeb-73104c16018a-kube-api-access-wcszg\") pod \"metallb-operator-webhook-server-5c874d5568-q92pm\" (UID: \"a611d5d0-f5a3-4e37-baeb-73104c16018a\") " pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.526538 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:17 crc kubenswrapper[4965]: I1125 15:19:17.623816 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:18 crc kubenswrapper[4965]: W1125 15:19:18.070069 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda611d5d0_f5a3_4e37_baeb_73104c16018a.slice/crio-cee63c508d6e601bdc9b57da07b977bf417de58937ae922626ec64d59942c21f WatchSource:0}: Error finding container cee63c508d6e601bdc9b57da07b977bf417de58937ae922626ec64d59942c21f: Status 404 returned error can't find the container with id cee63c508d6e601bdc9b57da07b977bf417de58937ae922626ec64d59942c21f Nov 25 15:19:18 crc kubenswrapper[4965]: I1125 15:19:18.075828 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm"] Nov 25 15:19:18 crc kubenswrapper[4965]: I1125 15:19:18.123813 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9"] Nov 25 15:19:18 crc kubenswrapper[4965]: W1125 15:19:18.158418 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f37b40d_6b30_4f51_a5d0_4767877a0fa3.slice/crio-c926e240d079e02b5887fab780d9feae481f4bc278edbf19b9f1aaa5792348ed WatchSource:0}: Error finding container c926e240d079e02b5887fab780d9feae481f4bc278edbf19b9f1aaa5792348ed: Status 404 returned error can't find the container with id c926e240d079e02b5887fab780d9feae481f4bc278edbf19b9f1aaa5792348ed Nov 25 15:19:18 crc kubenswrapper[4965]: I1125 15:19:18.382566 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" event={"ID":"7f37b40d-6b30-4f51-a5d0-4767877a0fa3","Type":"ContainerStarted","Data":"c926e240d079e02b5887fab780d9feae481f4bc278edbf19b9f1aaa5792348ed"} Nov 25 15:19:18 crc kubenswrapper[4965]: I1125 15:19:18.383844 4965 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" event={"ID":"a611d5d0-f5a3-4e37-baeb-73104c16018a","Type":"ContainerStarted","Data":"cee63c508d6e601bdc9b57da07b977bf417de58937ae922626ec64d59942c21f"} Nov 25 15:19:23 crc kubenswrapper[4965]: I1125 15:19:23.260910 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:19:23 crc kubenswrapper[4965]: I1125 15:19:23.261472 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:19:23 crc kubenswrapper[4965]: I1125 15:19:23.991861 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t7nzn"] Nov 25 15:19:23 crc kubenswrapper[4965]: I1125 15:19:23.993288 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:23 crc kubenswrapper[4965]: I1125 15:19:23.998317 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t7nzn"] Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.115330 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-catalog-content\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.115682 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42j9t\" (UniqueName: \"kubernetes.io/projected/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-kube-api-access-42j9t\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.115733 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-utilities\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.216845 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-catalog-content\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.217154 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-42j9t\" (UniqueName: \"kubernetes.io/projected/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-kube-api-access-42j9t\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.217289 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-utilities\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.217356 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-catalog-content\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.217637 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-utilities\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.237530 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42j9t\" (UniqueName: \"kubernetes.io/projected/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-kube-api-access-42j9t\") pod \"community-operators-t7nzn\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:24 crc kubenswrapper[4965]: I1125 15:19:24.351978 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:28 crc kubenswrapper[4965]: I1125 15:19:28.104386 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t7nzn"] Nov 25 15:19:28 crc kubenswrapper[4965]: I1125 15:19:28.442866 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerStarted","Data":"dc58a3205813a6fff70ffb0f6eb8b33b556fdebc1e8a31d8337d3e5e00d261c6"} Nov 25 15:19:28 crc kubenswrapper[4965]: I1125 15:19:28.445261 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" event={"ID":"a611d5d0-f5a3-4e37-baeb-73104c16018a","Type":"ContainerStarted","Data":"86acf8cf2437886951dbabe165a174de7c6d7cf4481091f03a170586a791b13b"} Nov 25 15:19:28 crc kubenswrapper[4965]: I1125 15:19:28.446836 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" event={"ID":"7f37b40d-6b30-4f51-a5d0-4767877a0fa3","Type":"ContainerStarted","Data":"8bf066d16417dc49458b013a5630cae300023ecc4746bd28426ea227a5cac280"} Nov 25 15:19:28 crc kubenswrapper[4965]: I1125 15:19:28.447002 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:19:28 crc kubenswrapper[4965]: I1125 15:19:28.479090 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" podStartSLOduration=2.898890504 podStartE2EDuration="12.479075322s" podCreationTimestamp="2025-11-25 15:19:16 +0000 UTC" firstStartedPulling="2025-11-25 15:19:18.162765924 +0000 UTC m=+903.130359670" lastFinishedPulling="2025-11-25 15:19:27.742950742 +0000 UTC m=+912.710544488" observedRunningTime="2025-11-25 
15:19:28.473297255 +0000 UTC m=+913.440891001" watchObservedRunningTime="2025-11-25 15:19:28.479075322 +0000 UTC m=+913.446669068" Nov 25 15:19:29 crc kubenswrapper[4965]: I1125 15:19:29.455316 4965 generic.go:334] "Generic (PLEG): container finished" podID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerID="9883fe905e647b3c29ff2f90787cdc1266f0c13f2124c6d1eee3d8f8485f60fd" exitCode=0 Nov 25 15:19:29 crc kubenswrapper[4965]: I1125 15:19:29.455429 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerDied","Data":"9883fe905e647b3c29ff2f90787cdc1266f0c13f2124c6d1eee3d8f8485f60fd"} Nov 25 15:19:29 crc kubenswrapper[4965]: I1125 15:19:29.455792 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:29 crc kubenswrapper[4965]: I1125 15:19:29.493251 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" podStartSLOduration=2.758639866 podStartE2EDuration="12.493228714s" podCreationTimestamp="2025-11-25 15:19:17 +0000 UTC" firstStartedPulling="2025-11-25 15:19:18.073528937 +0000 UTC m=+903.041122683" lastFinishedPulling="2025-11-25 15:19:27.808117785 +0000 UTC m=+912.775711531" observedRunningTime="2025-11-25 15:19:29.490120879 +0000 UTC m=+914.457714655" watchObservedRunningTime="2025-11-25 15:19:29.493228714 +0000 UTC m=+914.460822460" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.504463 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerStarted","Data":"c15617b78a9179bd061317044ceada35c78eece09617ead2f073c0aac1e3fbe6"} Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.567673 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jktkw"] Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.569198 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.582865 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jktkw"] Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.606117 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-utilities\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.606428 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq28l\" (UniqueName: \"kubernetes.io/projected/41bf1377-6d80-4353-8f44-cb9c20b2299c-kube-api-access-rq28l\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.606546 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-catalog-content\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.707829 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-utilities\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " 
pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.707918 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq28l\" (UniqueName: \"kubernetes.io/projected/41bf1377-6d80-4353-8f44-cb9c20b2299c-kube-api-access-rq28l\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.707938 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-catalog-content\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.708428 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-catalog-content\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.708509 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-utilities\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.726702 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq28l\" (UniqueName: \"kubernetes.io/projected/41bf1377-6d80-4353-8f44-cb9c20b2299c-kube-api-access-rq28l\") pod \"redhat-marketplace-jktkw\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " 
pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:37 crc kubenswrapper[4965]: I1125 15:19:37.883187 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:38 crc kubenswrapper[4965]: I1125 15:19:38.321867 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jktkw"] Nov 25 15:19:38 crc kubenswrapper[4965]: W1125 15:19:38.332635 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bf1377_6d80_4353_8f44_cb9c20b2299c.slice/crio-e83722858598f6f79a3b58d48e513687854c97ff567bf0322eff83352f87432d WatchSource:0}: Error finding container e83722858598f6f79a3b58d48e513687854c97ff567bf0322eff83352f87432d: Status 404 returned error can't find the container with id e83722858598f6f79a3b58d48e513687854c97ff567bf0322eff83352f87432d Nov 25 15:19:38 crc kubenswrapper[4965]: I1125 15:19:38.511443 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jktkw" event={"ID":"41bf1377-6d80-4353-8f44-cb9c20b2299c","Type":"ContainerStarted","Data":"e83722858598f6f79a3b58d48e513687854c97ff567bf0322eff83352f87432d"} Nov 25 15:19:38 crc kubenswrapper[4965]: I1125 15:19:38.515310 4965 generic.go:334] "Generic (PLEG): container finished" podID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerID="c15617b78a9179bd061317044ceada35c78eece09617ead2f073c0aac1e3fbe6" exitCode=0 Nov 25 15:19:38 crc kubenswrapper[4965]: I1125 15:19:38.515338 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerDied","Data":"c15617b78a9179bd061317044ceada35c78eece09617ead2f073c0aac1e3fbe6"} Nov 25 15:19:39 crc kubenswrapper[4965]: I1125 15:19:39.522092 4965 generic.go:334] "Generic (PLEG): container finished" 
podID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerID="6b0bf253a898b5095038ad9e87baa3663a4703dec6be5fc8dff9161dcb809f97" exitCode=0 Nov 25 15:19:39 crc kubenswrapper[4965]: I1125 15:19:39.522190 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jktkw" event={"ID":"41bf1377-6d80-4353-8f44-cb9c20b2299c","Type":"ContainerDied","Data":"6b0bf253a898b5095038ad9e87baa3663a4703dec6be5fc8dff9161dcb809f97"} Nov 25 15:19:41 crc kubenswrapper[4965]: I1125 15:19:41.536761 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerStarted","Data":"04a1ee5ef637a0c7e4b9efe1eb2b72108eb1ec3ffdf8cc4568b6bd93369badbb"} Nov 25 15:19:41 crc kubenswrapper[4965]: I1125 15:19:41.560959 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t7nzn" podStartSLOduration=7.866240696 podStartE2EDuration="18.560934695s" podCreationTimestamp="2025-11-25 15:19:23 +0000 UTC" firstStartedPulling="2025-11-25 15:19:29.458014066 +0000 UTC m=+914.425607802" lastFinishedPulling="2025-11-25 15:19:40.152708055 +0000 UTC m=+925.120301801" observedRunningTime="2025-11-25 15:19:41.555227339 +0000 UTC m=+926.522821095" watchObservedRunningTime="2025-11-25 15:19:41.560934695 +0000 UTC m=+926.528528441" Nov 25 15:19:44 crc kubenswrapper[4965]: I1125 15:19:44.352244 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:44 crc kubenswrapper[4965]: I1125 15:19:44.352823 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:44 crc kubenswrapper[4965]: I1125 15:19:44.413348 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:44 
crc kubenswrapper[4965]: I1125 15:19:44.554534 4965 generic.go:334] "Generic (PLEG): container finished" podID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerID="a5ab1c4758e3204c023b4be6a9bfa757fcfffba7d2a161de3acb2d7ca2864c71" exitCode=0 Nov 25 15:19:44 crc kubenswrapper[4965]: I1125 15:19:44.555110 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jktkw" event={"ID":"41bf1377-6d80-4353-8f44-cb9c20b2299c","Type":"ContainerDied","Data":"a5ab1c4758e3204c023b4be6a9bfa757fcfffba7d2a161de3acb2d7ca2864c71"} Nov 25 15:19:47 crc kubenswrapper[4965]: I1125 15:19:47.580157 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jktkw" event={"ID":"41bf1377-6d80-4353-8f44-cb9c20b2299c","Type":"ContainerStarted","Data":"02deaf9b67440ef1a1dcf085c1323f0a7894507be77e5031fa210c43079a6038"} Nov 25 15:19:47 crc kubenswrapper[4965]: I1125 15:19:47.606269 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jktkw" podStartSLOduration=3.714269978 podStartE2EDuration="10.606245318s" podCreationTimestamp="2025-11-25 15:19:37 +0000 UTC" firstStartedPulling="2025-11-25 15:19:39.524249754 +0000 UTC m=+924.491843500" lastFinishedPulling="2025-11-25 15:19:46.416225094 +0000 UTC m=+931.383818840" observedRunningTime="2025-11-25 15:19:47.602743333 +0000 UTC m=+932.570337089" watchObservedRunningTime="2025-11-25 15:19:47.606245318 +0000 UTC m=+932.573839074" Nov 25 15:19:47 crc kubenswrapper[4965]: I1125 15:19:47.629069 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5c874d5568-q92pm" Nov 25 15:19:47 crc kubenswrapper[4965]: I1125 15:19:47.883803 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:47 crc kubenswrapper[4965]: I1125 15:19:47.883862 4965 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:47 crc kubenswrapper[4965]: I1125 15:19:47.924658 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:53 crc kubenswrapper[4965]: I1125 15:19:53.260069 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:19:53 crc kubenswrapper[4965]: I1125 15:19:53.260675 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:19:54 crc kubenswrapper[4965]: I1125 15:19:54.394352 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:54 crc kubenswrapper[4965]: I1125 15:19:54.444121 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t7nzn"] Nov 25 15:19:54 crc kubenswrapper[4965]: I1125 15:19:54.620620 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t7nzn" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="registry-server" containerID="cri-o://04a1ee5ef637a0c7e4b9efe1eb2b72108eb1ec3ffdf8cc4568b6bd93369badbb" gracePeriod=2 Nov 25 15:19:56 crc kubenswrapper[4965]: I1125 15:19:56.633404 4965 generic.go:334] "Generic (PLEG): container finished" podID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" 
containerID="04a1ee5ef637a0c7e4b9efe1eb2b72108eb1ec3ffdf8cc4568b6bd93369badbb" exitCode=0 Nov 25 15:19:56 crc kubenswrapper[4965]: I1125 15:19:56.633486 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerDied","Data":"04a1ee5ef637a0c7e4b9efe1eb2b72108eb1ec3ffdf8cc4568b6bd93369badbb"} Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.506532 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.624013 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42j9t\" (UniqueName: \"kubernetes.io/projected/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-kube-api-access-42j9t\") pod \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.624292 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-utilities\") pod \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.624335 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-catalog-content\") pod \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\" (UID: \"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42\") " Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.625216 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-utilities" (OuterVolumeSpecName: "utilities") pod "c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" (UID: 
"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.631870 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-kube-api-access-42j9t" (OuterVolumeSpecName: "kube-api-access-42j9t") pod "c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" (UID: "c6cb8b78-bd3b-4b21-8252-5e84f45a6f42"). InnerVolumeSpecName "kube-api-access-42j9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.646292 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7nzn" event={"ID":"c6cb8b78-bd3b-4b21-8252-5e84f45a6f42","Type":"ContainerDied","Data":"dc58a3205813a6fff70ffb0f6eb8b33b556fdebc1e8a31d8337d3e5e00d261c6"} Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.647360 4965 scope.go:117] "RemoveContainer" containerID="04a1ee5ef637a0c7e4b9efe1eb2b72108eb1ec3ffdf8cc4568b6bd93369badbb" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.647326 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7nzn" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.671271 4965 scope.go:117] "RemoveContainer" containerID="c15617b78a9179bd061317044ceada35c78eece09617ead2f073c0aac1e3fbe6" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.687117 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" (UID: "c6cb8b78-bd3b-4b21-8252-5e84f45a6f42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.698060 4965 scope.go:117] "RemoveContainer" containerID="9883fe905e647b3c29ff2f90787cdc1266f0c13f2124c6d1eee3d8f8485f60fd" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.725919 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42j9t\" (UniqueName: \"kubernetes.io/projected/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-kube-api-access-42j9t\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.725953 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.726134 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.988090 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t7nzn"] Nov 25 15:19:57 crc kubenswrapper[4965]: I1125 15:19:57.993516 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t7nzn"] Nov 25 15:19:58 crc kubenswrapper[4965]: I1125 15:19:58.007392 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:19:58 crc kubenswrapper[4965]: I1125 15:19:58.783506 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" path="/var/lib/kubelet/pods/c6cb8b78-bd3b-4b21-8252-5e84f45a6f42/volumes" Nov 25 15:20:00 crc kubenswrapper[4965]: I1125 15:20:00.225067 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jktkw"] Nov 25 15:20:00 crc kubenswrapper[4965]: I1125 15:20:00.225657 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jktkw" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="registry-server" containerID="cri-o://02deaf9b67440ef1a1dcf085c1323f0a7894507be77e5031fa210c43079a6038" gracePeriod=2 Nov 25 15:20:00 crc kubenswrapper[4965]: I1125 15:20:00.663418 4965 generic.go:334] "Generic (PLEG): container finished" podID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerID="02deaf9b67440ef1a1dcf085c1323f0a7894507be77e5031fa210c43079a6038" exitCode=0 Nov 25 15:20:00 crc kubenswrapper[4965]: I1125 15:20:00.663465 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jktkw" event={"ID":"41bf1377-6d80-4353-8f44-cb9c20b2299c","Type":"ContainerDied","Data":"02deaf9b67440ef1a1dcf085c1323f0a7894507be77e5031fa210c43079a6038"} Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.369572 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.481870 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq28l\" (UniqueName: \"kubernetes.io/projected/41bf1377-6d80-4353-8f44-cb9c20b2299c-kube-api-access-rq28l\") pod \"41bf1377-6d80-4353-8f44-cb9c20b2299c\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.481962 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-catalog-content\") pod \"41bf1377-6d80-4353-8f44-cb9c20b2299c\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.482076 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-utilities\") pod \"41bf1377-6d80-4353-8f44-cb9c20b2299c\" (UID: \"41bf1377-6d80-4353-8f44-cb9c20b2299c\") " Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.482871 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-utilities" (OuterVolumeSpecName: "utilities") pod "41bf1377-6d80-4353-8f44-cb9c20b2299c" (UID: "41bf1377-6d80-4353-8f44-cb9c20b2299c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.526740 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41bf1377-6d80-4353-8f44-cb9c20b2299c-kube-api-access-rq28l" (OuterVolumeSpecName: "kube-api-access-rq28l") pod "41bf1377-6d80-4353-8f44-cb9c20b2299c" (UID: "41bf1377-6d80-4353-8f44-cb9c20b2299c"). InnerVolumeSpecName "kube-api-access-rq28l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.533429 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41bf1377-6d80-4353-8f44-cb9c20b2299c" (UID: "41bf1377-6d80-4353-8f44-cb9c20b2299c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.583020 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq28l\" (UniqueName: \"kubernetes.io/projected/41bf1377-6d80-4353-8f44-cb9c20b2299c-kube-api-access-rq28l\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.583052 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.583064 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41bf1377-6d80-4353-8f44-cb9c20b2299c-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.671091 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jktkw" event={"ID":"41bf1377-6d80-4353-8f44-cb9c20b2299c","Type":"ContainerDied","Data":"e83722858598f6f79a3b58d48e513687854c97ff567bf0322eff83352f87432d"} Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.671137 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jktkw" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.671152 4965 scope.go:117] "RemoveContainer" containerID="02deaf9b67440ef1a1dcf085c1323f0a7894507be77e5031fa210c43079a6038" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.690513 4965 scope.go:117] "RemoveContainer" containerID="a5ab1c4758e3204c023b4be6a9bfa757fcfffba7d2a161de3acb2d7ca2864c71" Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.696315 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jktkw"] Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.700834 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jktkw"] Nov 25 15:20:01 crc kubenswrapper[4965]: I1125 15:20:01.712378 4965 scope.go:117] "RemoveContainer" containerID="6b0bf253a898b5095038ad9e87baa3663a4703dec6be5fc8dff9161dcb809f97" Nov 25 15:20:02 crc kubenswrapper[4965]: I1125 15:20:02.780417 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" path="/var/lib/kubelet/pods/41bf1377-6d80-4353-8f44-cb9c20b2299c/volumes" Nov 25 15:20:07 crc kubenswrapper[4965]: I1125 15:20:07.532602 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6c8f57bc76-pdgb9" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319528 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5vf7m"] Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.319760 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="extract-utilities" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319772 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="extract-utilities" Nov 25 15:20:08 crc 
kubenswrapper[4965]: E1125 15:20:08.319780 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="extract-content" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319786 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="extract-content" Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.319796 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="registry-server" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319802 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="registry-server" Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.319815 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="registry-server" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319820 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="registry-server" Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.319848 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="extract-utilities" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319856 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="extract-utilities" Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.319867 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="extract-content" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.319873 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="extract-content" Nov 25 15:20:08 crc 
kubenswrapper[4965]: I1125 15:20:08.320002 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6cb8b78-bd3b-4b21-8252-5e84f45a6f42" containerName="registry-server" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.320014 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bf1377-6d80-4353-8f44-cb9c20b2299c" containerName="registry-server" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.327915 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.336334 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.336576 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fcm2n" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.336350 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.368039 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr"] Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.369138 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.373264 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.391803 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr"] Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.535687 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-startup\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.535762 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwmk\" (UniqueName: \"kubernetes.io/projected/4ed3f916-aee7-4d42-b704-bf4e22789ce0-kube-api-access-crwmk\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536153 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-sockets\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536256 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ed3f916-aee7-4d42-b704-bf4e22789ce0-cert\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " 
pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536302 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-metrics-certs\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536334 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtfk\" (UniqueName: \"kubernetes.io/projected/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-kube-api-access-tdtfk\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536415 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-metrics\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536488 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-reloader\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.536555 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-conf\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 
15:20:08.563029 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-4pfqd"] Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.564079 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.567758 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.581738 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-j2bqm"] Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.582590 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.585907 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.586178 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.587885 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.587959 4965 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xg9mf" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.591758 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-4pfqd"] Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.638611 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-metrics\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " 
pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.638884 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-reloader\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.638985 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-metrics-certs\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.638922 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-metrics\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639078 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clqhn\" (UniqueName: \"kubernetes.io/projected/41064ec4-3f9f-481d-8b5f-695a592ec58d-kube-api-access-clqhn\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639147 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-reloader\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639152 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-conf\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639221 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-startup\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639265 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crwmk\" (UniqueName: \"kubernetes.io/projected/4ed3f916-aee7-4d42-b704-bf4e22789ce0-kube-api-access-crwmk\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639303 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-sockets\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639335 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639362 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-cert\") pod 
\"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639385 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-metrics-certs\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639411 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ed3f916-aee7-4d42-b704-bf4e22789ce0-cert\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639430 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/41064ec4-3f9f-481d-8b5f-695a592ec58d-metallb-excludel2\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639457 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-metrics-certs\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639482 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtfk\" (UniqueName: \"kubernetes.io/projected/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-kube-api-access-tdtfk\") pod \"frr-k8s-5vf7m\" (UID: 
\"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639517 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhl5q\" (UniqueName: \"kubernetes.io/projected/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-kube-api-access-zhl5q\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-conf\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.639600 4965 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.639637 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ed3f916-aee7-4d42-b704-bf4e22789ce0-cert podName:4ed3f916-aee7-4d42-b704-bf4e22789ce0 nodeName:}" failed. No retries permitted until 2025-11-25 15:20:09.139622816 +0000 UTC m=+954.107216562 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4ed3f916-aee7-4d42-b704-bf4e22789ce0-cert") pod "frr-k8s-webhook-server-6998585d5-sbxmr" (UID: "4ed3f916-aee7-4d42-b704-bf4e22789ce0") : secret "frr-k8s-webhook-server-cert" not found Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.639681 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-sockets\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.640166 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-frr-startup\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.645248 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-metrics-certs\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.665047 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwmk\" (UniqueName: \"kubernetes.io/projected/4ed3f916-aee7-4d42-b704-bf4e22789ce0-kube-api-access-crwmk\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.676644 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtfk\" (UniqueName: 
\"kubernetes.io/projected/ae8db73e-12e4-40b7-8d6b-d44b36b79b46-kube-api-access-tdtfk\") pod \"frr-k8s-5vf7m\" (UID: \"ae8db73e-12e4-40b7-8d6b-d44b36b79b46\") " pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740092 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740130 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-cert\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740146 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-metrics-certs\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740193 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/41064ec4-3f9f-481d-8b5f-695a592ec58d-metallb-excludel2\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740224 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhl5q\" (UniqueName: \"kubernetes.io/projected/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-kube-api-access-zhl5q\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: 
\"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740246 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-metrics-certs\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.740263 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clqhn\" (UniqueName: \"kubernetes.io/projected/41064ec4-3f9f-481d-8b5f-695a592ec58d-kube-api-access-clqhn\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.740604 4965 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.740642 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist podName:41064ec4-3f9f-481d-8b5f-695a592ec58d nodeName:}" failed. No retries permitted until 2025-11-25 15:20:09.240630323 +0000 UTC m=+954.208224069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist") pod "speaker-j2bqm" (UID: "41064ec4-3f9f-481d-8b5f-695a592ec58d") : secret "metallb-memberlist" not found Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.741201 4965 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 25 15:20:08 crc kubenswrapper[4965]: E1125 15:20:08.741232 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-metrics-certs podName:8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73 nodeName:}" failed. No retries permitted until 2025-11-25 15:20:09.241224079 +0000 UTC m=+954.208817825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-metrics-certs") pod "controller-6c7b4b5f48-4pfqd" (UID: "8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73") : secret "controller-certs-secret" not found Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.741813 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/41064ec4-3f9f-481d-8b5f-695a592ec58d-metallb-excludel2\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.749039 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-cert\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.752678 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-metrics-certs\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.768474 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhl5q\" (UniqueName: \"kubernetes.io/projected/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-kube-api-access-zhl5q\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.774268 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clqhn\" (UniqueName: \"kubernetes.io/projected/41064ec4-3f9f-481d-8b5f-695a592ec58d-kube-api-access-clqhn\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:08 crc kubenswrapper[4965]: I1125 15:20:08.960532 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.145640 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ed3f916-aee7-4d42-b704-bf4e22789ce0-cert\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.152010 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ed3f916-aee7-4d42-b704-bf4e22789ce0-cert\") pod \"frr-k8s-webhook-server-6998585d5-sbxmr\" (UID: \"4ed3f916-aee7-4d42-b704-bf4e22789ce0\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.246572 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.246621 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-metrics-certs\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:09 crc kubenswrapper[4965]: E1125 15:20:09.246743 4965 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 15:20:09 crc kubenswrapper[4965]: E1125 15:20:09.246812 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist podName:41064ec4-3f9f-481d-8b5f-695a592ec58d 
nodeName:}" failed. No retries permitted until 2025-11-25 15:20:10.246795709 +0000 UTC m=+955.214389455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist") pod "speaker-j2bqm" (UID: "41064ec4-3f9f-481d-8b5f-695a592ec58d") : secret "metallb-memberlist" not found Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.249644 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73-metrics-certs\") pod \"controller-6c7b4b5f48-4pfqd\" (UID: \"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73\") " pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.317103 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.476854 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:09 crc kubenswrapper[4965]: W1125 15:20:09.560638 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ed3f916_aee7_4d42_b704_bf4e22789ce0.slice/crio-6a15c954890cc6dca192058407b67bc7bb53b8fd04301d3c18dcac315bfe2217 WatchSource:0}: Error finding container 6a15c954890cc6dca192058407b67bc7bb53b8fd04301d3c18dcac315bfe2217: Status 404 returned error can't find the container with id 6a15c954890cc6dca192058407b67bc7bb53b8fd04301d3c18dcac315bfe2217 Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.563318 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr"] Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.710647 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-4pfqd"] Nov 25 15:20:09 crc kubenswrapper[4965]: W1125 15:20:09.715545 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fa594fd_579e_4f16_8ebc_3ecb8ec0ab73.slice/crio-59dc09d585c96de6ba4d7371c7c1e34da00505c7683296a699b69f8c5bbddfaf WatchSource:0}: Error finding container 59dc09d585c96de6ba4d7371c7c1e34da00505c7683296a699b69f8c5bbddfaf: Status 404 returned error can't find the container with id 59dc09d585c96de6ba4d7371c7c1e34da00505c7683296a699b69f8c5bbddfaf Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.725425 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-4pfqd" event={"ID":"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73","Type":"ContainerStarted","Data":"59dc09d585c96de6ba4d7371c7c1e34da00505c7683296a699b69f8c5bbddfaf"} Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.726503 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" 
event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"f696960f323c84b13917918a4c984f5a3fedc2459804103bd85debf7c25d4869"} Nov 25 15:20:09 crc kubenswrapper[4965]: I1125 15:20:09.727514 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" event={"ID":"4ed3f916-aee7-4d42-b704-bf4e22789ce0","Type":"ContainerStarted","Data":"6a15c954890cc6dca192058407b67bc7bb53b8fd04301d3c18dcac315bfe2217"} Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.262922 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.268276 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/41064ec4-3f9f-481d-8b5f-695a592ec58d-memberlist\") pod \"speaker-j2bqm\" (UID: \"41064ec4-3f9f-481d-8b5f-695a592ec58d\") " pod="metallb-system/speaker-j2bqm" Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.395541 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-j2bqm" Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.733787 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-4pfqd" event={"ID":"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73","Type":"ContainerStarted","Data":"5d5e1bcd5b40109c748d817ef4977585aa501d26c28a339a10136b411c506faa"} Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.733826 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-4pfqd" event={"ID":"8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73","Type":"ContainerStarted","Data":"2b2f0d6c22353afa41c08eb74c1b3508b8ae0f3d6afadcc2a2b3407a73d66fd3"} Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.734691 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:10 crc kubenswrapper[4965]: I1125 15:20:10.736277 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j2bqm" event={"ID":"41064ec4-3f9f-481d-8b5f-695a592ec58d","Type":"ContainerStarted","Data":"46d7b9ad5ba84374bc225cff0ef58ec542865a754bdf2064c949b2416404ff63"} Nov 25 15:20:11 crc kubenswrapper[4965]: I1125 15:20:11.746029 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j2bqm" event={"ID":"41064ec4-3f9f-481d-8b5f-695a592ec58d","Type":"ContainerStarted","Data":"78369c998d92e5573dabd5fcc9f34c6cd5c01ad51caed4ce45577d33bd60436e"} Nov 25 15:20:11 crc kubenswrapper[4965]: I1125 15:20:11.746410 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j2bqm" event={"ID":"41064ec4-3f9f-481d-8b5f-695a592ec58d","Type":"ContainerStarted","Data":"3d18e856449115f5be9b04a5d288a008372ab324e5ee8219003e79a4b6f7d21a"} Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.721307 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-4pfqd" 
podStartSLOduration=4.721281033 podStartE2EDuration="4.721281033s" podCreationTimestamp="2025-11-25 15:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:20:10.775070342 +0000 UTC m=+955.742664108" watchObservedRunningTime="2025-11-25 15:20:12.721281033 +0000 UTC m=+957.688874789" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.723615 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hv4f2"] Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.724797 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.729186 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hv4f2"] Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.758475 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-j2bqm" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.792735 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-j2bqm" podStartSLOduration=4.792713016 podStartE2EDuration="4.792713016s" podCreationTimestamp="2025-11-25 15:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:20:12.779328202 +0000 UTC m=+957.746921948" watchObservedRunningTime="2025-11-25 15:20:12.792713016 +0000 UTC m=+957.760306762" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.813042 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-utilities\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " 
pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.813099 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzm9j\" (UniqueName: \"kubernetes.io/projected/061157a9-59f8-48d9-b892-af578dd09f37-kube-api-access-zzm9j\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.813186 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-catalog-content\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.914901 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-utilities\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.914948 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzm9j\" (UniqueName: \"kubernetes.io/projected/061157a9-59f8-48d9-b892-af578dd09f37-kube-api-access-zzm9j\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.915009 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-catalog-content\") pod \"certified-operators-hv4f2\" (UID: 
\"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.915478 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-catalog-content\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.915680 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-utilities\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:12 crc kubenswrapper[4965]: I1125 15:20:12.938595 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzm9j\" (UniqueName: \"kubernetes.io/projected/061157a9-59f8-48d9-b892-af578dd09f37-kube-api-access-zzm9j\") pod \"certified-operators-hv4f2\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:13 crc kubenswrapper[4965]: I1125 15:20:13.044302 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:13 crc kubenswrapper[4965]: I1125 15:20:13.595251 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hv4f2"] Nov 25 15:20:13 crc kubenswrapper[4965]: W1125 15:20:13.607111 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod061157a9_59f8_48d9_b892_af578dd09f37.slice/crio-4148126a62d30f12f9d7133fd388271b5af8e8e50fef301eeeb53e093133beaf WatchSource:0}: Error finding container 4148126a62d30f12f9d7133fd388271b5af8e8e50fef301eeeb53e093133beaf: Status 404 returned error can't find the container with id 4148126a62d30f12f9d7133fd388271b5af8e8e50fef301eeeb53e093133beaf Nov 25 15:20:13 crc kubenswrapper[4965]: I1125 15:20:13.763267 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4f2" event={"ID":"061157a9-59f8-48d9-b892-af578dd09f37","Type":"ContainerStarted","Data":"4148126a62d30f12f9d7133fd388271b5af8e8e50fef301eeeb53e093133beaf"} Nov 25 15:20:14 crc kubenswrapper[4965]: I1125 15:20:14.776494 4965 generic.go:334] "Generic (PLEG): container finished" podID="061157a9-59f8-48d9-b892-af578dd09f37" containerID="0cadeed6511f26ab600dee17f4fc112240cd28eec89dcf8176a0f2cc0888f667" exitCode=0 Nov 25 15:20:14 crc kubenswrapper[4965]: I1125 15:20:14.784586 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4f2" event={"ID":"061157a9-59f8-48d9-b892-af578dd09f37","Type":"ContainerDied","Data":"0cadeed6511f26ab600dee17f4fc112240cd28eec89dcf8176a0f2cc0888f667"} Nov 25 15:20:19 crc kubenswrapper[4965]: I1125 15:20:19.483406 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-4pfqd" Nov 25 15:20:20 crc kubenswrapper[4965]: I1125 15:20:20.399218 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/speaker-j2bqm" Nov 25 15:20:23 crc kubenswrapper[4965]: I1125 15:20:23.211853 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"f394f54b1ded7955a946ad8c5c76bc4617b17a95ed62e412932cb1638cd46faa"} Nov 25 15:20:23 crc kubenswrapper[4965]: I1125 15:20:23.259953 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:20:23 crc kubenswrapper[4965]: I1125 15:20:23.260062 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:20:23 crc kubenswrapper[4965]: I1125 15:20:23.260108 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:20:23 crc kubenswrapper[4965]: I1125 15:20:23.260582 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00ca3c30c6342c0ded628729d3f70a02171e1d4a4c62216224c37d3f6ce21240"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:20:23 crc kubenswrapper[4965]: I1125 15:20:23.260655 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" 
containerName="machine-config-daemon" containerID="cri-o://00ca3c30c6342c0ded628729d3f70a02171e1d4a4c62216224c37d3f6ce21240" gracePeriod=600 Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.107409 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t6t8g"] Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.108609 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.111314 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-flsq8" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.112435 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.113530 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.127002 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t6t8g"] Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.171446 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbnz\" (UniqueName: \"kubernetes.io/projected/fe0fcbae-0cbb-4dc6-972b-106531288f4b-kube-api-access-kkbnz\") pod \"openstack-operator-index-t6t8g\" (UID: \"fe0fcbae-0cbb-4dc6-972b-106531288f4b\") " pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.218111 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" 
event={"ID":"4ed3f916-aee7-4d42-b704-bf4e22789ce0","Type":"ContainerStarted","Data":"740687ec9368aebfdde09dabae4a4a7b239bac48521ec16003d8c635731c6582"} Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.220549 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="00ca3c30c6342c0ded628729d3f70a02171e1d4a4c62216224c37d3f6ce21240" exitCode=0 Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.220657 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"00ca3c30c6342c0ded628729d3f70a02171e1d4a4c62216224c37d3f6ce21240"} Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.220722 4965 scope.go:117] "RemoveContainer" containerID="99a3ddb14dfc84a3500a205d74675321fc95f75084490879b200cf9441df58f4" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.222176 4965 generic.go:334] "Generic (PLEG): container finished" podID="ae8db73e-12e4-40b7-8d6b-d44b36b79b46" containerID="f394f54b1ded7955a946ad8c5c76bc4617b17a95ed62e412932cb1638cd46faa" exitCode=0 Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.222255 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerDied","Data":"f394f54b1ded7955a946ad8c5c76bc4617b17a95ed62e412932cb1638cd46faa"} Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.225736 4965 generic.go:334] "Generic (PLEG): container finished" podID="061157a9-59f8-48d9-b892-af578dd09f37" containerID="4c0f1282c255db6ad5f87092925bbcbfcc20e87ba04a9378048b85a11c39eff9" exitCode=0 Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.225777 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4f2" 
event={"ID":"061157a9-59f8-48d9-b892-af578dd09f37","Type":"ContainerDied","Data":"4c0f1282c255db6ad5f87092925bbcbfcc20e87ba04a9378048b85a11c39eff9"} Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.242384 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" podStartSLOduration=3.252838786 podStartE2EDuration="16.242367009s" podCreationTimestamp="2025-11-25 15:20:08 +0000 UTC" firstStartedPulling="2025-11-25 15:20:09.563263636 +0000 UTC m=+954.530857382" lastFinishedPulling="2025-11-25 15:20:22.552791849 +0000 UTC m=+967.520385605" observedRunningTime="2025-11-25 15:20:24.24165833 +0000 UTC m=+969.209252076" watchObservedRunningTime="2025-11-25 15:20:24.242367009 +0000 UTC m=+969.209960755" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.272343 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbnz\" (UniqueName: \"kubernetes.io/projected/fe0fcbae-0cbb-4dc6-972b-106531288f4b-kube-api-access-kkbnz\") pod \"openstack-operator-index-t6t8g\" (UID: \"fe0fcbae-0cbb-4dc6-972b-106531288f4b\") " pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.309592 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbnz\" (UniqueName: \"kubernetes.io/projected/fe0fcbae-0cbb-4dc6-972b-106531288f4b-kube-api-access-kkbnz\") pod \"openstack-operator-index-t6t8g\" (UID: \"fe0fcbae-0cbb-4dc6-972b-106531288f4b\") " pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.431927 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:24 crc kubenswrapper[4965]: I1125 15:20:24.646921 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t6t8g"] Nov 25 15:20:25 crc kubenswrapper[4965]: I1125 15:20:25.234024 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"be8cabf8c298dce6dc5c47e109690923bbdb10ab8f0bdbfa1738209ba0e27a1b"} Nov 25 15:20:25 crc kubenswrapper[4965]: I1125 15:20:25.236200 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t6t8g" event={"ID":"fe0fcbae-0cbb-4dc6-972b-106531288f4b","Type":"ContainerStarted","Data":"902751543f5fa14675eeaa10379159bb3219d178d1e5b8d58caab84ea926501f"} Nov 25 15:20:25 crc kubenswrapper[4965]: I1125 15:20:25.237796 4965 generic.go:334] "Generic (PLEG): container finished" podID="ae8db73e-12e4-40b7-8d6b-d44b36b79b46" containerID="77f27301d56a9d8a169e239075a6b21f7a2994c32e683bd56495640d55e777b3" exitCode=0 Nov 25 15:20:25 crc kubenswrapper[4965]: I1125 15:20:25.238533 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerDied","Data":"77f27301d56a9d8a169e239075a6b21f7a2994c32e683bd56495640d55e777b3"} Nov 25 15:20:25 crc kubenswrapper[4965]: I1125 15:20:25.238569 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:26 crc kubenswrapper[4965]: I1125 15:20:26.250047 4965 generic.go:334] "Generic (PLEG): container finished" podID="ae8db73e-12e4-40b7-8d6b-d44b36b79b46" containerID="f1a4b6fe97701003ec905609d015f625094637dbb20b2a8ef935f8fb3c8f7c7b" exitCode=0 Nov 25 15:20:26 crc kubenswrapper[4965]: I1125 15:20:26.250165 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerDied","Data":"f1a4b6fe97701003ec905609d015f625094637dbb20b2a8ef935f8fb3c8f7c7b"} Nov 25 15:20:27 crc kubenswrapper[4965]: I1125 15:20:27.259189 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"948da384ae65b14e2944bec65e0c078d9389510ba454ae35c2d1505a8f343b32"} Nov 25 15:20:27 crc kubenswrapper[4965]: I1125 15:20:27.261458 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4f2" event={"ID":"061157a9-59f8-48d9-b892-af578dd09f37","Type":"ContainerStarted","Data":"edfb30c2d2161f2670cbc93a6a47e0e626b4a323d88b8dedcf3ae0baf7689f1d"} Nov 25 15:20:28 crc kubenswrapper[4965]: I1125 15:20:28.058005 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hv4f2" podStartSLOduration=4.256181414 podStartE2EDuration="16.057986876s" podCreationTimestamp="2025-11-25 15:20:12 +0000 UTC" firstStartedPulling="2025-11-25 15:20:14.778509932 +0000 UTC m=+959.746103678" lastFinishedPulling="2025-11-25 15:20:26.580315394 +0000 UTC m=+971.547909140" observedRunningTime="2025-11-25 15:20:27.285294587 +0000 UTC m=+972.252888353" watchObservedRunningTime="2025-11-25 15:20:28.057986876 +0000 UTC m=+973.025580612" Nov 25 15:20:28 crc kubenswrapper[4965]: I1125 15:20:28.061812 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-t6t8g"] Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.054658 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xx52h"] Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.055798 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.085686 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xx52h"] Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.154598 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffdks\" (UniqueName: \"kubernetes.io/projected/c4e856d6-2a93-44e9-81c8-c965842a65d9-kube-api-access-ffdks\") pod \"openstack-operator-index-xx52h\" (UID: \"c4e856d6-2a93-44e9-81c8-c965842a65d9\") " pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.256252 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffdks\" (UniqueName: \"kubernetes.io/projected/c4e856d6-2a93-44e9-81c8-c965842a65d9-kube-api-access-ffdks\") pod \"openstack-operator-index-xx52h\" (UID: \"c4e856d6-2a93-44e9-81c8-c965842a65d9\") " pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.287315 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"89dbc84c6bd23a27bdf817a3864b0ae2fc9e7669877b0919d42a1833403441bd"} Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.295752 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffdks\" (UniqueName: \"kubernetes.io/projected/c4e856d6-2a93-44e9-81c8-c965842a65d9-kube-api-access-ffdks\") pod \"openstack-operator-index-xx52h\" (UID: \"c4e856d6-2a93-44e9-81c8-c965842a65d9\") " pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.373865 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:29 crc kubenswrapper[4965]: W1125 15:20:29.817205 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e856d6_2a93_44e9_81c8_c965842a65d9.slice/crio-dcb5fa1fe212c9706b37627c6fab92f5a5ef5e311cf88b791865f89b7a5213f0 WatchSource:0}: Error finding container dcb5fa1fe212c9706b37627c6fab92f5a5ef5e311cf88b791865f89b7a5213f0: Status 404 returned error can't find the container with id dcb5fa1fe212c9706b37627c6fab92f5a5ef5e311cf88b791865f89b7a5213f0 Nov 25 15:20:29 crc kubenswrapper[4965]: I1125 15:20:29.817864 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xx52h"] Nov 25 15:20:30 crc kubenswrapper[4965]: I1125 15:20:30.292330 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xx52h" event={"ID":"c4e856d6-2a93-44e9-81c8-c965842a65d9","Type":"ContainerStarted","Data":"dcb5fa1fe212c9706b37627c6fab92f5a5ef5e311cf88b791865f89b7a5213f0"} Nov 25 15:20:30 crc kubenswrapper[4965]: I1125 15:20:30.295414 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"a8b1c920f6a2f53303a8acb7a742880d6b46db46ac46689e69ee4ee4f6e87e49"} Nov 25 15:20:31 crc kubenswrapper[4965]: I1125 15:20:31.303811 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"374d3212bdcbd8622192b43154628af838be51dc598f70d647230e0c88fbd0ab"} Nov 25 15:20:32 crc kubenswrapper[4965]: I1125 15:20:32.313547 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" 
event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"c4e393a59947750a3a1c19c1148cbcb56ba77254e8b691757574604a88bd33ec"} Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.045436 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.045481 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.092753 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.326403 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vf7m" event={"ID":"ae8db73e-12e4-40b7-8d6b-d44b36b79b46","Type":"ContainerStarted","Data":"a1f272c1207293ffe7db8a1367a29b1ace53bede3dab5188001114a4727d81fe"} Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.355791 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5vf7m" podStartSLOduration=11.986832206 podStartE2EDuration="25.355773658s" podCreationTimestamp="2025-11-25 15:20:08 +0000 UTC" firstStartedPulling="2025-11-25 15:20:09.186594971 +0000 UTC m=+954.154188717" lastFinishedPulling="2025-11-25 15:20:22.555536383 +0000 UTC m=+967.523130169" observedRunningTime="2025-11-25 15:20:33.346214349 +0000 UTC m=+978.313808125" watchObservedRunningTime="2025-11-25 15:20:33.355773658 +0000 UTC m=+978.323367424" Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.370138 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:33 crc kubenswrapper[4965]: I1125 15:20:33.961836 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:34 crc kubenswrapper[4965]: I1125 15:20:34.002703 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:34 crc kubenswrapper[4965]: I1125 15:20:34.332839 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:36 crc kubenswrapper[4965]: I1125 15:20:36.461038 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hv4f2"] Nov 25 15:20:36 crc kubenswrapper[4965]: I1125 15:20:36.461459 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hv4f2" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="registry-server" containerID="cri-o://edfb30c2d2161f2670cbc93a6a47e0e626b4a323d88b8dedcf3ae0baf7689f1d" gracePeriod=2 Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.359200 4965 generic.go:334] "Generic (PLEG): container finished" podID="061157a9-59f8-48d9-b892-af578dd09f37" containerID="edfb30c2d2161f2670cbc93a6a47e0e626b4a323d88b8dedcf3ae0baf7689f1d" exitCode=0 Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.359346 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4f2" event={"ID":"061157a9-59f8-48d9-b892-af578dd09f37","Type":"ContainerDied","Data":"edfb30c2d2161f2670cbc93a6a47e0e626b4a323d88b8dedcf3ae0baf7689f1d"} Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.751212 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.891663 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzm9j\" (UniqueName: \"kubernetes.io/projected/061157a9-59f8-48d9-b892-af578dd09f37-kube-api-access-zzm9j\") pod \"061157a9-59f8-48d9-b892-af578dd09f37\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.891751 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-catalog-content\") pod \"061157a9-59f8-48d9-b892-af578dd09f37\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.891820 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-utilities\") pod \"061157a9-59f8-48d9-b892-af578dd09f37\" (UID: \"061157a9-59f8-48d9-b892-af578dd09f37\") " Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.893195 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-utilities" (OuterVolumeSpecName: "utilities") pod "061157a9-59f8-48d9-b892-af578dd09f37" (UID: "061157a9-59f8-48d9-b892-af578dd09f37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.900168 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061157a9-59f8-48d9-b892-af578dd09f37-kube-api-access-zzm9j" (OuterVolumeSpecName: "kube-api-access-zzm9j") pod "061157a9-59f8-48d9-b892-af578dd09f37" (UID: "061157a9-59f8-48d9-b892-af578dd09f37"). InnerVolumeSpecName "kube-api-access-zzm9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.941667 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "061157a9-59f8-48d9-b892-af578dd09f37" (UID: "061157a9-59f8-48d9-b892-af578dd09f37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.965851 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5vf7m" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.992865 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzm9j\" (UniqueName: \"kubernetes.io/projected/061157a9-59f8-48d9-b892-af578dd09f37-kube-api-access-zzm9j\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.992891 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:38 crc kubenswrapper[4965]: I1125 15:20:38.992911 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061157a9-59f8-48d9-b892-af578dd09f37-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:39 crc kubenswrapper[4965]: I1125 15:20:39.322220 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-sbxmr" Nov 25 15:20:39 crc kubenswrapper[4965]: I1125 15:20:39.365000 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hv4f2" 
event={"ID":"061157a9-59f8-48d9-b892-af578dd09f37","Type":"ContainerDied","Data":"4148126a62d30f12f9d7133fd388271b5af8e8e50fef301eeeb53e093133beaf"} Nov 25 15:20:39 crc kubenswrapper[4965]: I1125 15:20:39.366122 4965 scope.go:117] "RemoveContainer" containerID="edfb30c2d2161f2670cbc93a6a47e0e626b4a323d88b8dedcf3ae0baf7689f1d" Nov 25 15:20:39 crc kubenswrapper[4965]: I1125 15:20:39.366395 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hv4f2" Nov 25 15:20:39 crc kubenswrapper[4965]: I1125 15:20:39.433420 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hv4f2"] Nov 25 15:20:39 crc kubenswrapper[4965]: I1125 15:20:39.437568 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hv4f2"] Nov 25 15:20:40 crc kubenswrapper[4965]: I1125 15:20:40.780321 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061157a9-59f8-48d9-b892-af578dd09f37" path="/var/lib/kubelet/pods/061157a9-59f8-48d9-b892-af578dd09f37/volumes" Nov 25 15:20:48 crc kubenswrapper[4965]: I1125 15:20:48.825749 4965 scope.go:117] "RemoveContainer" containerID="4c0f1282c255db6ad5f87092925bbcbfcc20e87ba04a9378048b85a11c39eff9" Nov 25 15:20:49 crc kubenswrapper[4965]: I1125 15:20:49.353027 4965 scope.go:117] "RemoveContainer" containerID="0cadeed6511f26ab600dee17f4fc112240cd28eec89dcf8176a0f2cc0888f667" Nov 25 15:20:54 crc kubenswrapper[4965]: I1125 15:20:54.506627 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t6t8g" event={"ID":"fe0fcbae-0cbb-4dc6-972b-106531288f4b","Type":"ContainerStarted","Data":"3fcd82dc0395305b64f4c35e835ee3dbff505ac6a166684c2f7c4d73a7d9fe4a"} Nov 25 15:20:54 crc kubenswrapper[4965]: I1125 15:20:54.506700 4965 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-t6t8g" podUID="fe0fcbae-0cbb-4dc6-972b-106531288f4b" containerName="registry-server" containerID="cri-o://3fcd82dc0395305b64f4c35e835ee3dbff505ac6a166684c2f7c4d73a7d9fe4a" gracePeriod=2 Nov 25 15:20:54 crc kubenswrapper[4965]: I1125 15:20:54.508692 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xx52h" event={"ID":"c4e856d6-2a93-44e9-81c8-c965842a65d9","Type":"ContainerStarted","Data":"9fd5b79622d233a5938e1027bedd144f92c82e1ec5bd074a852e02a5d3b9eadd"} Nov 25 15:20:54 crc kubenswrapper[4965]: I1125 15:20:54.533670 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t6t8g" podStartSLOduration=1.85562238 podStartE2EDuration="30.533636428s" podCreationTimestamp="2025-11-25 15:20:24 +0000 UTC" firstStartedPulling="2025-11-25 15:20:24.652494983 +0000 UTC m=+969.620088729" lastFinishedPulling="2025-11-25 15:20:53.330509001 +0000 UTC m=+998.298102777" observedRunningTime="2025-11-25 15:20:54.526207556 +0000 UTC m=+999.493801312" watchObservedRunningTime="2025-11-25 15:20:54.533636428 +0000 UTC m=+999.501230174" Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.518395 4965 generic.go:334] "Generic (PLEG): container finished" podID="fe0fcbae-0cbb-4dc6-972b-106531288f4b" containerID="3fcd82dc0395305b64f4c35e835ee3dbff505ac6a166684c2f7c4d73a7d9fe4a" exitCode=0 Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.519023 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t6t8g" event={"ID":"fe0fcbae-0cbb-4dc6-972b-106531288f4b","Type":"ContainerDied","Data":"3fcd82dc0395305b64f4c35e835ee3dbff505ac6a166684c2f7c4d73a7d9fe4a"} Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.600781 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.626654 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xx52h" podStartSLOduration=3.109602872 podStartE2EDuration="26.626631393s" podCreationTimestamp="2025-11-25 15:20:29 +0000 UTC" firstStartedPulling="2025-11-25 15:20:29.820598983 +0000 UTC m=+974.788192729" lastFinishedPulling="2025-11-25 15:20:53.337627464 +0000 UTC m=+998.305221250" observedRunningTime="2025-11-25 15:20:54.545702186 +0000 UTC m=+999.513295932" watchObservedRunningTime="2025-11-25 15:20:55.626631393 +0000 UTC m=+1000.594225149" Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.740723 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkbnz\" (UniqueName: \"kubernetes.io/projected/fe0fcbae-0cbb-4dc6-972b-106531288f4b-kube-api-access-kkbnz\") pod \"fe0fcbae-0cbb-4dc6-972b-106531288f4b\" (UID: \"fe0fcbae-0cbb-4dc6-972b-106531288f4b\") " Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.754763 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0fcbae-0cbb-4dc6-972b-106531288f4b-kube-api-access-kkbnz" (OuterVolumeSpecName: "kube-api-access-kkbnz") pod "fe0fcbae-0cbb-4dc6-972b-106531288f4b" (UID: "fe0fcbae-0cbb-4dc6-972b-106531288f4b"). InnerVolumeSpecName "kube-api-access-kkbnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:20:55 crc kubenswrapper[4965]: I1125 15:20:55.842919 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkbnz\" (UniqueName: \"kubernetes.io/projected/fe0fcbae-0cbb-4dc6-972b-106531288f4b-kube-api-access-kkbnz\") on node \"crc\" DevicePath \"\"" Nov 25 15:20:56 crc kubenswrapper[4965]: I1125 15:20:56.524466 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t6t8g" event={"ID":"fe0fcbae-0cbb-4dc6-972b-106531288f4b","Type":"ContainerDied","Data":"902751543f5fa14675eeaa10379159bb3219d178d1e5b8d58caab84ea926501f"} Nov 25 15:20:56 crc kubenswrapper[4965]: I1125 15:20:56.524521 4965 scope.go:117] "RemoveContainer" containerID="3fcd82dc0395305b64f4c35e835ee3dbff505ac6a166684c2f7c4d73a7d9fe4a" Nov 25 15:20:56 crc kubenswrapper[4965]: I1125 15:20:56.524524 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t6t8g" Nov 25 15:20:56 crc kubenswrapper[4965]: I1125 15:20:56.558265 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-t6t8g"] Nov 25 15:20:56 crc kubenswrapper[4965]: I1125 15:20:56.562710 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-t6t8g"] Nov 25 15:20:56 crc kubenswrapper[4965]: I1125 15:20:56.785271 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0fcbae-0cbb-4dc6-972b-106531288f4b" path="/var/lib/kubelet/pods/fe0fcbae-0cbb-4dc6-972b-106531288f4b/volumes" Nov 25 15:20:59 crc kubenswrapper[4965]: I1125 15:20:59.374652 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:59 crc kubenswrapper[4965]: I1125 15:20:59.375093 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:59 crc kubenswrapper[4965]: I1125 15:20:59.408394 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:20:59 crc kubenswrapper[4965]: I1125 15:20:59.590063 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xx52h" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.239610 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw"] Nov 25 15:21:02 crc kubenswrapper[4965]: E1125 15:21:02.240504 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fcbae-0cbb-4dc6-972b-106531288f4b" containerName="registry-server" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.240536 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fcbae-0cbb-4dc6-972b-106531288f4b" containerName="registry-server" Nov 25 15:21:02 crc kubenswrapper[4965]: E1125 15:21:02.240572 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="extract-utilities" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.240590 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="extract-utilities" Nov 25 15:21:02 crc kubenswrapper[4965]: E1125 15:21:02.240628 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="extract-content" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.240647 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="extract-content" Nov 25 15:21:02 crc kubenswrapper[4965]: E1125 15:21:02.240675 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="registry-server" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.240692 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="registry-server" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.240960 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="061157a9-59f8-48d9-b892-af578dd09f37" containerName="registry-server" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.241027 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fcbae-0cbb-4dc6-972b-106531288f4b" containerName="registry-server" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.243033 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.245148 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tjntl" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.263511 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw"] Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.334852 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-bundle\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.334924 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9tcr\" (UniqueName: 
\"kubernetes.io/projected/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-kube-api-access-s9tcr\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.335038 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-util\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.436684 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-bundle\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.436824 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9tcr\" (UniqueName: \"kubernetes.io/projected/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-kube-api-access-s9tcr\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.437018 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-util\") pod 
\"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.437814 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-bundle\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.439164 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-util\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.460171 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9tcr\" (UniqueName: \"kubernetes.io/projected/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-kube-api-access-s9tcr\") pod \"9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.566207 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:02 crc kubenswrapper[4965]: I1125 15:21:02.767306 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw"] Nov 25 15:21:03 crc kubenswrapper[4965]: I1125 15:21:03.592735 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerStarted","Data":"370ed73ff8cff88b7da5d0cc18b1008a43ead6b1e7d381befa16e63292997361"} Nov 25 15:21:03 crc kubenswrapper[4965]: I1125 15:21:03.593102 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerStarted","Data":"897311f0cc4d856d03cad9b149b424992c3b3706e44a7ebf4aff78a2e7f23acb"} Nov 25 15:21:04 crc kubenswrapper[4965]: I1125 15:21:04.604657 4965 generic.go:334] "Generic (PLEG): container finished" podID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerID="370ed73ff8cff88b7da5d0cc18b1008a43ead6b1e7d381befa16e63292997361" exitCode=0 Nov 25 15:21:04 crc kubenswrapper[4965]: I1125 15:21:04.604785 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerDied","Data":"370ed73ff8cff88b7da5d0cc18b1008a43ead6b1e7d381befa16e63292997361"} Nov 25 15:21:08 crc kubenswrapper[4965]: I1125 15:21:08.644338 4965 generic.go:334] "Generic (PLEG): container finished" podID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerID="1a9da40c5214298b5440a5a17756d7fa93760ecc2e219af2d8bf833217a2c960" exitCode=0 Nov 25 15:21:08 crc kubenswrapper[4965]: I1125 15:21:08.644422 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerDied","Data":"1a9da40c5214298b5440a5a17756d7fa93760ecc2e219af2d8bf833217a2c960"} Nov 25 15:21:09 crc kubenswrapper[4965]: I1125 15:21:09.652055 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerStarted","Data":"6b993cdce447c5220d7d1952ad841ca3ec7aee7b078f406655d894ae884003d5"} Nov 25 15:21:10 crc kubenswrapper[4965]: I1125 15:21:10.664192 4965 generic.go:334] "Generic (PLEG): container finished" podID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerID="6b993cdce447c5220d7d1952ad841ca3ec7aee7b078f406655d894ae884003d5" exitCode=0 Nov 25 15:21:10 crc kubenswrapper[4965]: I1125 15:21:10.664380 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerDied","Data":"6b993cdce447c5220d7d1952ad841ca3ec7aee7b078f406655d894ae884003d5"} Nov 25 15:21:11 crc kubenswrapper[4965]: I1125 15:21:11.923367 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.096838 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9tcr\" (UniqueName: \"kubernetes.io/projected/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-kube-api-access-s9tcr\") pod \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.097099 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-util\") pod \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.097280 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-bundle\") pod \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\" (UID: \"3d3f08fa-5e27-46b5-b0dd-76e9860c0729\") " Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.097868 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-bundle" (OuterVolumeSpecName: "bundle") pod "3d3f08fa-5e27-46b5-b0dd-76e9860c0729" (UID: "3d3f08fa-5e27-46b5-b0dd-76e9860c0729"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.103901 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-kube-api-access-s9tcr" (OuterVolumeSpecName: "kube-api-access-s9tcr") pod "3d3f08fa-5e27-46b5-b0dd-76e9860c0729" (UID: "3d3f08fa-5e27-46b5-b0dd-76e9860c0729"). InnerVolumeSpecName "kube-api-access-s9tcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.114265 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-util" (OuterVolumeSpecName: "util") pod "3d3f08fa-5e27-46b5-b0dd-76e9860c0729" (UID: "3d3f08fa-5e27-46b5-b0dd-76e9860c0729"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.198537 4965 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.198567 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9tcr\" (UniqueName: \"kubernetes.io/projected/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-kube-api-access-s9tcr\") on node \"crc\" DevicePath \"\"" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.198578 4965 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d3f08fa-5e27-46b5-b0dd-76e9860c0729-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.683407 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" event={"ID":"3d3f08fa-5e27-46b5-b0dd-76e9860c0729","Type":"ContainerDied","Data":"897311f0cc4d856d03cad9b149b424992c3b3706e44a7ebf4aff78a2e7f23acb"} Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.684194 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="897311f0cc4d856d03cad9b149b424992c3b3706e44a7ebf4aff78a2e7f23acb" Nov 25 15:21:12 crc kubenswrapper[4965]: I1125 15:21:12.683540 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.827587 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts"] Nov 25 15:21:19 crc kubenswrapper[4965]: E1125 15:21:19.829683 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="extract" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.829722 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="extract" Nov 25 15:21:19 crc kubenswrapper[4965]: E1125 15:21:19.829747 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="util" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.829760 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="util" Nov 25 15:21:19 crc kubenswrapper[4965]: E1125 15:21:19.829778 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="pull" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.829787 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="pull" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.846230 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3f08fa-5e27-46b5-b0dd-76e9860c0729" containerName="extract" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.850332 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.853702 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts"] Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.855285 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-cwvgd" Nov 25 15:21:19 crc kubenswrapper[4965]: I1125 15:21:19.995704 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x56hc\" (UniqueName: \"kubernetes.io/projected/716065f2-f9f2-41fd-a193-3e815b38456e-kube-api-access-x56hc\") pod \"openstack-operator-controller-operator-6d88ccc4fc-wz2ts\" (UID: \"716065f2-f9f2-41fd-a193-3e815b38456e\") " pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:20 crc kubenswrapper[4965]: I1125 15:21:20.096822 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x56hc\" (UniqueName: \"kubernetes.io/projected/716065f2-f9f2-41fd-a193-3e815b38456e-kube-api-access-x56hc\") pod \"openstack-operator-controller-operator-6d88ccc4fc-wz2ts\" (UID: \"716065f2-f9f2-41fd-a193-3e815b38456e\") " pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:20 crc kubenswrapper[4965]: I1125 15:21:20.117810 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x56hc\" (UniqueName: \"kubernetes.io/projected/716065f2-f9f2-41fd-a193-3e815b38456e-kube-api-access-x56hc\") pod \"openstack-operator-controller-operator-6d88ccc4fc-wz2ts\" (UID: \"716065f2-f9f2-41fd-a193-3e815b38456e\") " pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:20 crc kubenswrapper[4965]: I1125 15:21:20.186156 4965 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:20 crc kubenswrapper[4965]: I1125 15:21:20.465746 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts"] Nov 25 15:21:20 crc kubenswrapper[4965]: W1125 15:21:20.472232 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716065f2_f9f2_41fd_a193_3e815b38456e.slice/crio-e3cb935909fcabd9034b9b7f12bbbc0a5c16a8ae652b0cf1bbbfd5abb5887f2a WatchSource:0}: Error finding container e3cb935909fcabd9034b9b7f12bbbc0a5c16a8ae652b0cf1bbbfd5abb5887f2a: Status 404 returned error can't find the container with id e3cb935909fcabd9034b9b7f12bbbc0a5c16a8ae652b0cf1bbbfd5abb5887f2a Nov 25 15:21:20 crc kubenswrapper[4965]: I1125 15:21:20.740885 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" event={"ID":"716065f2-f9f2-41fd-a193-3e815b38456e","Type":"ContainerStarted","Data":"e3cb935909fcabd9034b9b7f12bbbc0a5c16a8ae652b0cf1bbbfd5abb5887f2a"} Nov 25 15:21:26 crc kubenswrapper[4965]: I1125 15:21:26.782107 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" event={"ID":"716065f2-f9f2-41fd-a193-3e815b38456e","Type":"ContainerStarted","Data":"f233abd8b3c171b3ade35d9e81a3af975953260b605220dc885a0102827a7488"} Nov 25 15:21:26 crc kubenswrapper[4965]: I1125 15:21:26.782635 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:26 crc kubenswrapper[4965]: I1125 15:21:26.862408 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" podStartSLOduration=2.158920487 podStartE2EDuration="7.862389621s" podCreationTimestamp="2025-11-25 15:21:19 +0000 UTC" firstStartedPulling="2025-11-25 15:21:20.476321261 +0000 UTC m=+1025.443915007" lastFinishedPulling="2025-11-25 15:21:26.179790385 +0000 UTC m=+1031.147384141" observedRunningTime="2025-11-25 15:21:26.857858907 +0000 UTC m=+1031.825452673" watchObservedRunningTime="2025-11-25 15:21:26.862389621 +0000 UTC m=+1031.829983367" Nov 25 15:21:40 crc kubenswrapper[4965]: I1125 15:21:40.189550 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6d88ccc4fc-wz2ts" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.186327 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.189145 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.198993 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8g42t" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.210478 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.215396 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.216385 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.230149 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tspxb" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.237206 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.255976 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.258827 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.268307 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5kq87" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.272040 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjdrm\" (UniqueName: \"kubernetes.io/projected/34390094-977c-4d9b-a9dd-8f4d4a5a89ad-kube-api-access-mjdrm\") pod \"barbican-operator-controller-manager-86dc4d89c8-7k79p\" (UID: \"34390094-977c-4d9b-a9dd-8f4d4a5a89ad\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.272103 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z8sf\" (UniqueName: \"kubernetes.io/projected/dbc985bf-ffef-456f-b4bd-37faeba9e8a1-kube-api-access-8z8sf\") pod \"designate-operator-controller-manager-7d695c9b56-7nxmk\" (UID: 
\"dbc985bf-ffef-456f-b4bd-37faeba9e8a1\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.272124 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zprtk\" (UniqueName: \"kubernetes.io/projected/764687bc-d3b6-47b3-96d8-8c31f47ab473-kube-api-access-zprtk\") pod \"cinder-operator-controller-manager-79856dc55c-wt2wg\" (UID: \"764687bc-d3b6-47b3-96d8-8c31f47ab473\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.279071 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.291111 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.292317 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.304742 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qfdxx" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.319049 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.341824 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-7fj92"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.344919 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.355554 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wlzhw" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.368419 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-7fj92"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.373403 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjdrm\" (UniqueName: \"kubernetes.io/projected/34390094-977c-4d9b-a9dd-8f4d4a5a89ad-kube-api-access-mjdrm\") pod \"barbican-operator-controller-manager-86dc4d89c8-7k79p\" (UID: \"34390094-977c-4d9b-a9dd-8f4d4a5a89ad\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.373689 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbcz\" (UniqueName: \"kubernetes.io/projected/b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4-kube-api-access-vrbcz\") pod \"heat-operator-controller-manager-774b86978c-7fj92\" (UID: \"b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.373818 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z8sf\" (UniqueName: \"kubernetes.io/projected/dbc985bf-ffef-456f-b4bd-37faeba9e8a1-kube-api-access-8z8sf\") pod \"designate-operator-controller-manager-7d695c9b56-7nxmk\" (UID: \"dbc985bf-ffef-456f-b4bd-37faeba9e8a1\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.373914 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zprtk\" (UniqueName: \"kubernetes.io/projected/764687bc-d3b6-47b3-96d8-8c31f47ab473-kube-api-access-zprtk\") pod \"cinder-operator-controller-manager-79856dc55c-wt2wg\" (UID: \"764687bc-d3b6-47b3-96d8-8c31f47ab473\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.374089 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdrf\" (UniqueName: \"kubernetes.io/projected/579b7594-cdbd-4b63-9405-0321a133d2d0-kube-api-access-2zdrf\") pod \"glance-operator-controller-manager-68b95954c9-kdff2\" (UID: \"579b7594-cdbd-4b63-9405-0321a133d2d0\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.395877 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.396998 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.422955 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjdrm\" (UniqueName: \"kubernetes.io/projected/34390094-977c-4d9b-a9dd-8f4d4a5a89ad-kube-api-access-mjdrm\") pod \"barbican-operator-controller-manager-86dc4d89c8-7k79p\" (UID: \"34390094-977c-4d9b-a9dd-8f4d4a5a89ad\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.423306 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-75vtp" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.423830 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z8sf\" (UniqueName: \"kubernetes.io/projected/dbc985bf-ffef-456f-b4bd-37faeba9e8a1-kube-api-access-8z8sf\") pod \"designate-operator-controller-manager-7d695c9b56-7nxmk\" (UID: \"dbc985bf-ffef-456f-b4bd-37faeba9e8a1\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.435608 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprtk\" (UniqueName: \"kubernetes.io/projected/764687bc-d3b6-47b3-96d8-8c31f47ab473-kube-api-access-zprtk\") pod \"cinder-operator-controller-manager-79856dc55c-wt2wg\" (UID: \"764687bc-d3b6-47b3-96d8-8c31f47ab473\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.455032 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"] Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.474012 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.475382 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.475586 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhsp\" (UniqueName: \"kubernetes.io/projected/756bbaba-31b3-4cd8-b3a6-6a3e0b805261-kube-api-access-tmhsp\") pod \"horizon-operator-controller-manager-68c9694994-v96z8\" (UID: \"756bbaba-31b3-4cd8-b3a6-6a3e0b805261\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.476010 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbcz\" (UniqueName: \"kubernetes.io/projected/b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4-kube-api-access-vrbcz\") pod \"heat-operator-controller-manager-774b86978c-7fj92\" (UID: \"b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.476123 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdrf\" (UniqueName: \"kubernetes.io/projected/579b7594-cdbd-4b63-9405-0321a133d2d0-kube-api-access-2zdrf\") pod \"glance-operator-controller-manager-68b95954c9-kdff2\" (UID: \"579b7594-cdbd-4b63-9405-0321a133d2d0\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.479352 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-g8ccx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.484082 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.488730 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.490067 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.499500 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cszql"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.510075 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.514856 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.515288 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.527447 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.528509 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.532137 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.535412 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zjr9k"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.555287 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdrf\" (UniqueName: \"kubernetes.io/projected/579b7594-cdbd-4b63-9405-0321a133d2d0-kube-api-access-2zdrf\") pod \"glance-operator-controller-manager-68b95954c9-kdff2\" (UID: \"579b7594-cdbd-4b63-9405-0321a133d2d0\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.577411 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrbcz\" (UniqueName: \"kubernetes.io/projected/b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4-kube-api-access-vrbcz\") pod \"heat-operator-controller-manager-774b86978c-7fj92\" (UID: \"b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.577731 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.577600 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl89j\" (UniqueName: \"kubernetes.io/projected/a3dd58f4-c4d6-43dc-b9fa-78d464337376-kube-api-access-kl89j\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.578160 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhsp\" (UniqueName: \"kubernetes.io/projected/756bbaba-31b3-4cd8-b3a6-6a3e0b805261-kube-api-access-tmhsp\") pod \"horizon-operator-controller-manager-68c9694994-v96z8\" (UID: \"756bbaba-31b3-4cd8-b3a6-6a3e0b805261\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.578300 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgsgd\" (UniqueName: \"kubernetes.io/projected/90d4de2d-51f0-4b18-8272-905b733fc714-kube-api-access-sgsgd\") pod \"keystone-operator-controller-manager-748dc6576f-24796\" (UID: \"90d4de2d-51f0-4b18-8272-905b733fc714\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.578572 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.578675 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rkw\" (UniqueName: \"kubernetes.io/projected/7fca11c0-cc43-457e-a797-610c31c9bc7f-kube-api-access-89rkw\") pod \"ironic-operator-controller-manager-5bfcdc958c-f6vxm\" (UID: \"7fca11c0-cc43-457e-a797-610c31c9bc7f\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.580163 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.584925 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.596081 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.599468 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sq294"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.614577 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.620267 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.662944 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhsp\" (UniqueName: \"kubernetes.io/projected/756bbaba-31b3-4cd8-b3a6-6a3e0b805261-kube-api-access-tmhsp\") pod \"horizon-operator-controller-manager-68c9694994-v96z8\" (UID: \"756bbaba-31b3-4cd8-b3a6-6a3e0b805261\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.669011 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.681549 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl89j\" (UniqueName: \"kubernetes.io/projected/a3dd58f4-c4d6-43dc-b9fa-78d464337376-kube-api-access-kl89j\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.681895 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgsgd\" (UniqueName: \"kubernetes.io/projected/90d4de2d-51f0-4b18-8272-905b733fc714-kube-api-access-sgsgd\") pod \"keystone-operator-controller-manager-748dc6576f-24796\" (UID: \"90d4de2d-51f0-4b18-8272-905b733fc714\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.681993 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vwm\" (UniqueName: \"kubernetes.io/projected/6493c01f-7b22-4a04-9b25-b17ad7c790a1-kube-api-access-j9vwm\") pod \"manila-operator-controller-manager-58bb8d67cc-77tlb\" (UID: \"6493c01f-7b22-4a04-9b25-b17ad7c790a1\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.682162 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.687261 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.693408 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"
Nov 25 15:21:58 crc kubenswrapper[4965]: E1125 15:21:58.683189 4965 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 25 15:21:58 crc kubenswrapper[4965]: E1125 15:21:58.702182 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert podName:a3dd58f4-c4d6-43dc-b9fa-78d464337376 nodeName:}" failed. No retries permitted until 2025-11-25 15:21:59.202146008 +0000 UTC m=+1064.169739754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert") pod "infra-operator-controller-manager-d5cc86f4b-d5dnx" (UID: "a3dd58f4-c4d6-43dc-b9fa-78d464337376") : secret "infra-operator-webhook-server-cert" not found
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.703158 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-59wmc"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.695656 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89rkw\" (UniqueName: \"kubernetes.io/projected/7fca11c0-cc43-457e-a797-610c31c9bc7f-kube-api-access-89rkw\") pod \"ironic-operator-controller-manager-5bfcdc958c-f6vxm\" (UID: \"7fca11c0-cc43-457e-a797-610c31c9bc7f\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.709742 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl89j\" (UniqueName: \"kubernetes.io/projected/a3dd58f4-c4d6-43dc-b9fa-78d464337376-kube-api-access-kl89j\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.728722 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rkw\" (UniqueName: \"kubernetes.io/projected/7fca11c0-cc43-457e-a797-610c31c9bc7f-kube-api-access-89rkw\") pod \"ironic-operator-controller-manager-5bfcdc958c-f6vxm\" (UID: \"7fca11c0-cc43-457e-a797-610c31c9bc7f\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.745212 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.750049 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.751079 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.756638 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgsgd\" (UniqueName: \"kubernetes.io/projected/90d4de2d-51f0-4b18-8272-905b733fc714-kube-api-access-sgsgd\") pod \"keystone-operator-controller-manager-748dc6576f-24796\" (UID: \"90d4de2d-51f0-4b18-8272-905b733fc714\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.758838 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.760097 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.781564 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.789824 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-9g8rh"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.790045 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-42v4s"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.805745 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldd6\" (UniqueName: \"kubernetes.io/projected/af96aeb0-49ef-430d-9780-791c7a1b64da-kube-api-access-tldd6\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-j9tml\" (UID: \"af96aeb0-49ef-430d-9780-791c7a1b64da\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.806165 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nsrv\" (UniqueName: \"kubernetes.io/projected/2b7be07d-fe11-494c-97b3-fa95b997450f-kube-api-access-9nsrv\") pod \"neutron-operator-controller-manager-7c57c8bbc4-cf9pl\" (UID: \"2b7be07d-fe11-494c-97b3-fa95b997450f\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.806355 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vwm\" (UniqueName: \"kubernetes.io/projected/6493c01f-7b22-4a04-9b25-b17ad7c790a1-kube-api-access-j9vwm\") pod \"manila-operator-controller-manager-58bb8d67cc-77tlb\" (UID: \"6493c01f-7b22-4a04-9b25-b17ad7c790a1\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.806461 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqt6\" (UniqueName: \"kubernetes.io/projected/f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a-kube-api-access-zsqt6\") pod \"nova-operator-controller-manager-79556f57fc-8xprb\" (UID: \"f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.814313 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.829700 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.840613 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.841747 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.847269 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-h5t4t"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.853558 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vwm\" (UniqueName: \"kubernetes.io/projected/6493c01f-7b22-4a04-9b25-b17ad7c790a1-kube-api-access-j9vwm\") pod \"manila-operator-controller-manager-58bb8d67cc-77tlb\" (UID: \"6493c01f-7b22-4a04-9b25-b17ad7c790a1\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.865478 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"]
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.895036 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.907185 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.907767 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzgsz\" (UniqueName: \"kubernetes.io/projected/09d0b7cc-6fc4-40dd-a332-b405d049e756-kube-api-access-kzgsz\") pod \"octavia-operator-controller-manager-fd75fd47d-6hhqt\" (UID: \"09d0b7cc-6fc4-40dd-a332-b405d049e756\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.907823 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqt6\" (UniqueName: \"kubernetes.io/projected/f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a-kube-api-access-zsqt6\") pod \"nova-operator-controller-manager-79556f57fc-8xprb\" (UID: \"f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.907877 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldd6\" (UniqueName: \"kubernetes.io/projected/af96aeb0-49ef-430d-9780-791c7a1b64da-kube-api-access-tldd6\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-j9tml\" (UID: \"af96aeb0-49ef-430d-9780-791c7a1b64da\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.907904 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nsrv\" (UniqueName: \"kubernetes.io/projected/2b7be07d-fe11-494c-97b3-fa95b997450f-kube-api-access-9nsrv\") pod \"neutron-operator-controller-manager-7c57c8bbc4-cf9pl\" (UID: \"2b7be07d-fe11-494c-97b3-fa95b997450f\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"
Nov 25 15:21:58 crc kubenswrapper[4965]: I1125 15:21:58.933387 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.000467 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.001821 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.008949 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzgsz\" (UniqueName: \"kubernetes.io/projected/09d0b7cc-6fc4-40dd-a332-b405d049e756-kube-api-access-kzgsz\") pod \"octavia-operator-controller-manager-fd75fd47d-6hhqt\" (UID: \"09d0b7cc-6fc4-40dd-a332-b405d049e756\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.009837 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.010097 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zmhpz"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.016869 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.018235 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.051430 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2nfh5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.062121 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldd6\" (UniqueName: \"kubernetes.io/projected/af96aeb0-49ef-430d-9780-791c7a1b64da-kube-api-access-tldd6\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-j9tml\" (UID: \"af96aeb0-49ef-430d-9780-791c7a1b64da\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.068287 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqt6\" (UniqueName: \"kubernetes.io/projected/f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a-kube-api-access-zsqt6\") pod \"nova-operator-controller-manager-79556f57fc-8xprb\" (UID: \"f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.074202 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.076337 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nsrv\" (UniqueName: \"kubernetes.io/projected/2b7be07d-fe11-494c-97b3-fa95b997450f-kube-api-access-9nsrv\") pod \"neutron-operator-controller-manager-7c57c8bbc4-cf9pl\" (UID: \"2b7be07d-fe11-494c-97b3-fa95b997450f\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.081272 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.109605 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.110660 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzgsz\" (UniqueName: \"kubernetes.io/projected/09d0b7cc-6fc4-40dd-a332-b405d049e756-kube-api-access-kzgsz\") pod \"octavia-operator-controller-manager-fd75fd47d-6hhqt\" (UID: \"09d0b7cc-6fc4-40dd-a332-b405d049e756\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.111246 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whjd\" (UniqueName: \"kubernetes.io/projected/d781d742-fdc4-4480-90a3-6330b4add384-kube-api-access-9whjd\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.111293 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.111341 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44p58\" (UniqueName: \"kubernetes.io/projected/0952a381-bcc9-46de-bdac-bf2bdfe6ecc4-kube-api-access-44p58\") pod \"ovn-operator-controller-manager-66cf5c67ff-fwc8g\" (UID: \"0952a381-bcc9-46de-bdac-bf2bdfe6ecc4\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.123496 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.130360 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.153958 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.161289 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wsckc"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.196775 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.200124 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.214197 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whjd\" (UniqueName: \"kubernetes.io/projected/d781d742-fdc4-4480-90a3-6330b4add384-kube-api-access-9whjd\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.214257 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.214283 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.214315 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqs4v\" (UniqueName: \"kubernetes.io/projected/cd724754-7539-4700-9911-5c0ce503d70f-kube-api-access-nqs4v\") pod \"placement-operator-controller-manager-5db546f9d9-7fzrj\" (UID: \"cd724754-7539-4700-9911-5c0ce503d70f\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.214371 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44p58\" (UniqueName: \"kubernetes.io/projected/0952a381-bcc9-46de-bdac-bf2bdfe6ecc4-kube-api-access-44p58\") pod \"ovn-operator-controller-manager-66cf5c67ff-fwc8g\" (UID: \"0952a381-bcc9-46de-bdac-bf2bdfe6ecc4\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"
Nov 25 15:21:59 crc kubenswrapper[4965]: E1125 15:21:59.215383 4965 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 25 15:21:59 crc kubenswrapper[4965]: E1125 15:21:59.215438 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert podName:a3dd58f4-c4d6-43dc-b9fa-78d464337376 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:00.215421583 +0000 UTC m=+1065.183015329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert") pod "infra-operator-controller-manager-d5cc86f4b-d5dnx" (UID: "a3dd58f4-c4d6-43dc-b9fa-78d464337376") : secret "infra-operator-webhook-server-cert" not found
Nov 25 15:21:59 crc kubenswrapper[4965]: E1125 15:21:59.215987 4965 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 25 15:21:59 crc kubenswrapper[4965]: E1125 15:21:59.216017 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert podName:d781d742-fdc4-4480-90a3-6330b4add384 nodeName:}" failed. No retries permitted until 2025-11-25 15:21:59.716008059 +0000 UTC m=+1064.683601865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert") pod "openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" (UID: "d781d742-fdc4-4480-90a3-6330b4add384") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.230068 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.231196 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.255279 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.261273 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.267340 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9kpwv"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.273039 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.279887 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44p58\" (UniqueName: \"kubernetes.io/projected/0952a381-bcc9-46de-bdac-bf2bdfe6ecc4-kube-api-access-44p58\") pod \"ovn-operator-controller-manager-66cf5c67ff-fwc8g\" (UID: \"0952a381-bcc9-46de-bdac-bf2bdfe6ecc4\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.293677 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.304142 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hk6gw"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.310657 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whjd\" (UniqueName: \"kubernetes.io/projected/d781d742-fdc4-4480-90a3-6330b4add384-kube-api-access-9whjd\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.310726 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k"]
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.315650 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jftg7\" (UniqueName: \"kubernetes.io/projected/3ae7668e-2e54-482f-9340-8ffe413de1d1-kube-api-access-jftg7\") pod \"swift-operator-controller-manager-6fdc4fcf86-hg488\" (UID: \"3ae7668e-2e54-482f-9340-8ffe413de1d1\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.315768 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqs4v\" (UniqueName: \"kubernetes.io/projected/cd724754-7539-4700-9911-5c0ce503d70f-kube-api-access-nqs4v\") pod \"placement-operator-controller-manager-5db546f9d9-7fzrj\" (UID: \"cd724754-7539-4700-9911-5c0ce503d70f\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.315832 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxshh\" (UniqueName: \"kubernetes.io/projected/31a52118-75f6-4e53-a6b6-fd6378c61df8-kube-api-access-mxshh\") pod \"telemetry-operator-controller-manager-567f98c9d-xxb4h\" (UID: \"31a52118-75f6-4e53-a6b6-fd6378c61df8\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h"
Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.325028 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.335563 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hbrp4" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.336200 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.371601 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqs4v\" (UniqueName: \"kubernetes.io/projected/cd724754-7539-4700-9911-5c0ce503d70f-kube-api-access-nqs4v\") pod \"placement-operator-controller-manager-5db546f9d9-7fzrj\" (UID: \"cd724754-7539-4700-9911-5c0ce503d70f\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.381044 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k"] Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.413232 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.420747 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxshh\" (UniqueName: \"kubernetes.io/projected/31a52118-75f6-4e53-a6b6-fd6378c61df8-kube-api-access-mxshh\") pod \"telemetry-operator-controller-manager-567f98c9d-xxb4h\" (UID: \"31a52118-75f6-4e53-a6b6-fd6378c61df8\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.420822 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jftg7\" (UniqueName: \"kubernetes.io/projected/3ae7668e-2e54-482f-9340-8ffe413de1d1-kube-api-access-jftg7\") pod \"swift-operator-controller-manager-6fdc4fcf86-hg488\" (UID: \"3ae7668e-2e54-482f-9340-8ffe413de1d1\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.420917 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfpsw\" (UniqueName: \"kubernetes.io/projected/b85374cc-6464-4dd6-9c38-0cabb8fd7834-kube-api-access-rfpsw\") pod \"test-operator-controller-manager-5cb74df96-wmm8k\" (UID: \"b85374cc-6464-4dd6-9c38-0cabb8fd7834\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.443442 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-k6bww"] Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.476642 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.497266 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hnwfj" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.519547 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.520357 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxshh\" (UniqueName: \"kubernetes.io/projected/31a52118-75f6-4e53-a6b6-fd6378c61df8-kube-api-access-mxshh\") pod \"telemetry-operator-controller-manager-567f98c9d-xxb4h\" (UID: \"31a52118-75f6-4e53-a6b6-fd6378c61df8\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.524488 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfpsw\" (UniqueName: \"kubernetes.io/projected/b85374cc-6464-4dd6-9c38-0cabb8fd7834-kube-api-access-rfpsw\") pod \"test-operator-controller-manager-5cb74df96-wmm8k\" (UID: \"b85374cc-6464-4dd6-9c38-0cabb8fd7834\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.537462 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkdh2\" (UniqueName: \"kubernetes.io/projected/cae07c7e-b337-46bb-8b04-06643ee9e6c3-kube-api-access-tkdh2\") pod \"watcher-operator-controller-manager-864885998-k6bww\" (UID: \"cae07c7e-b337-46bb-8b04-06643ee9e6c3\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.534760 4965 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-k6bww"] Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.575458 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jftg7\" (UniqueName: \"kubernetes.io/projected/3ae7668e-2e54-482f-9340-8ffe413de1d1-kube-api-access-jftg7\") pod \"swift-operator-controller-manager-6fdc4fcf86-hg488\" (UID: \"3ae7668e-2e54-482f-9340-8ffe413de1d1\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.626084 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfpsw\" (UniqueName: \"kubernetes.io/projected/b85374cc-6464-4dd6-9c38-0cabb8fd7834-kube-api-access-rfpsw\") pod \"test-operator-controller-manager-5cb74df96-wmm8k\" (UID: \"b85374cc-6464-4dd6-9c38-0cabb8fd7834\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.662180 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.662493 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkdh2\" (UniqueName: \"kubernetes.io/projected/cae07c7e-b337-46bb-8b04-06643ee9e6c3-kube-api-access-tkdh2\") pod \"watcher-operator-controller-manager-864885998-k6bww\" (UID: \"cae07c7e-b337-46bb-8b04-06643ee9e6c3\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.662689 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.663593 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.770137 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" Nov 25 15:21:59 crc kubenswrapper[4965]: E1125 15:21:59.770324 4965 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 15:21:59 crc kubenswrapper[4965]: E1125 15:21:59.770377 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert podName:d781d742-fdc4-4480-90a3-6330b4add384 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:00.77036168 +0000 UTC m=+1065.737955426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert") pod "openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" (UID: "d781d742-fdc4-4480-90a3-6330b4add384") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.783889 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkdh2\" (UniqueName: \"kubernetes.io/projected/cae07c7e-b337-46bb-8b04-06643ee9e6c3-kube-api-access-tkdh2\") pod \"watcher-operator-controller-manager-864885998-k6bww\" (UID: \"cae07c7e-b337-46bb-8b04-06643ee9e6c3\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.829450 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-7fj92"] Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.870593 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq"] Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.871577 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.915654 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.916085 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kpk6h" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.916204 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.929104 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.936136 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq"] Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.994622 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr95l\" (UniqueName: \"kubernetes.io/projected/d8bdaece-696d-4306-a66b-46c7333eb788-kube-api-access-mr95l\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.994719 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " 
pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:21:59 crc kubenswrapper[4965]: I1125 15:21:59.994770 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.011484 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.031097 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.031915 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.037650 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-48rcc" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.056495 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" event={"ID":"b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4","Type":"ContainerStarted","Data":"9a75afd70ac73be0186f01f6b1a71eed923c9d0336d8304dd5c446f894dd2292"} Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.098031 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.098695 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr95l\" (UniqueName: \"kubernetes.io/projected/d8bdaece-696d-4306-a66b-46c7333eb788-kube-api-access-mr95l\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.098766 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.098792 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.098936 4965 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.098996 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs podName:d8bdaece-696d-4306-a66b-46c7333eb788 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:00.598980648 +0000 UTC m=+1065.566574394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs") pod "openstack-operator-controller-manager-68587559f4-7lqhq" (UID: "d8bdaece-696d-4306-a66b-46c7333eb788") : secret "metrics-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.099433 4965 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.099459 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs podName:d8bdaece-696d-4306-a66b-46c7333eb788 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:00.599451571 +0000 UTC m=+1065.567045317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs") pod "openstack-operator-controller-manager-68587559f4-7lqhq" (UID: "d8bdaece-696d-4306-a66b-46c7333eb788") : secret "webhook-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.123459 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr95l\" (UniqueName: \"kubernetes.io/projected/d8bdaece-696d-4306-a66b-46c7333eb788-kube-api-access-mr95l\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.201647 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9hbv\" (UniqueName: \"kubernetes.io/projected/a5374ce4-8ac3-422b-9d62-9412dea697d3-kube-api-access-p9hbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kckh6\" (UID: \"a5374ce4-8ac3-422b-9d62-9412dea697d3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.278728 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p"] Nov 25 15:22:00 crc kubenswrapper[4965]: W1125 15:22:00.285346 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34390094_977c_4d9b_a9dd_8f4d4a5a89ad.slice/crio-f14ffaf94bfa2f15a113373cc26689e503f92491cfb499aef0d350d2585301b5 WatchSource:0}: Error finding container f14ffaf94bfa2f15a113373cc26689e503f92491cfb499aef0d350d2585301b5: Status 404 returned error can't find the container with id f14ffaf94bfa2f15a113373cc26689e503f92491cfb499aef0d350d2585301b5 Nov 25 15:22:00 crc 
kubenswrapper[4965]: I1125 15:22:00.302278 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.302767 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9hbv\" (UniqueName: \"kubernetes.io/projected/a5374ce4-8ac3-422b-9d62-9412dea697d3-kube-api-access-p9hbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kckh6\" (UID: \"a5374ce4-8ac3-422b-9d62-9412dea697d3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.302820 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.311553 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3dd58f4-c4d6-43dc-b9fa-78d464337376-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-d5dnx\" (UID: \"a3dd58f4-c4d6-43dc-b9fa-78d464337376\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.333629 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9hbv\" (UniqueName: \"kubernetes.io/projected/a5374ce4-8ac3-422b-9d62-9412dea697d3-kube-api-access-p9hbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kckh6\" (UID: \"a5374ce4-8ac3-422b-9d62-9412dea697d3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.352039 4965 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.382903 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.393635 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.458487 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.602494 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.608539 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-24796"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.614358 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.614431 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " 
pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.614726 4965 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.616089 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs podName:d8bdaece-696d-4306-a66b-46c7333eb788 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:01.614768221 +0000 UTC m=+1066.582361967 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs") pod "openstack-operator-controller-manager-68587559f4-7lqhq" (UID: "d8bdaece-696d-4306-a66b-46c7333eb788") : secret "metrics-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.616450 4965 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: E1125 15:22:00.616478 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs podName:d8bdaece-696d-4306-a66b-46c7333eb788 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:01.616469757 +0000 UTC m=+1066.584063503 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs") pod "openstack-operator-controller-manager-68587559f4-7lqhq" (UID: "d8bdaece-696d-4306-a66b-46c7333eb788") : secret "webhook-server-cert" not found Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.739366 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.799301 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.810517 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8"] Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.818205 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.826050 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d781d742-fdc4-4480-90a3-6330b4add384-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5\" (UID: \"d781d742-fdc4-4480-90a3-6330b4add384\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" Nov 25 15:22:00 crc kubenswrapper[4965]: I1125 15:22:00.867784 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.036733 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.052839 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.108397 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.130658 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.133623 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" event={"ID":"579b7594-cdbd-4b63-9405-0321a133d2d0","Type":"ContainerStarted","Data":"c754d06319e10a87758784a98a6f63066fd5da29ebf3a9290f9bdb93ad98f351"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.137761 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl" event={"ID":"2b7be07d-fe11-494c-97b3-fa95b997450f","Type":"ContainerStarted","Data":"0b36439ce30203d550fb8101792ec7e2ce7246800a16654375ddd4b065dbd27b"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.138813 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" event={"ID":"af96aeb0-49ef-430d-9780-791c7a1b64da","Type":"ContainerStarted","Data":"2a2b6386e2f41b9c307531eeac6a21547562fe1e8cd42310b76adf2f2d43aa4e"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 
15:22:01.139625 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" event={"ID":"756bbaba-31b3-4cd8-b3a6-6a3e0b805261","Type":"ContainerStarted","Data":"3221b72f4bd3cdae17d9fab077994e8d51e61dcdf49a2516a1ab4d109ebacf88"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.140304 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" event={"ID":"764687bc-d3b6-47b3-96d8-8c31f47ab473","Type":"ContainerStarted","Data":"d7e994ead3e508966470fb5215b3c028660512a014bdec7d94529c8fe70a4e5f"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.140939 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" event={"ID":"34390094-977c-4d9b-a9dd-8f4d4a5a89ad","Type":"ContainerStarted","Data":"f14ffaf94bfa2f15a113373cc26689e503f92491cfb499aef0d350d2585301b5"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.141605 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb" event={"ID":"6493c01f-7b22-4a04-9b25-b17ad7c790a1","Type":"ContainerStarted","Data":"cda9af3e9d3d0504e2273a9602f2aaabc364687bf3a9d7dafd1d4d22cd7ce78d"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.142348 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796" event={"ID":"90d4de2d-51f0-4b18-8272-905b733fc714","Type":"ContainerStarted","Data":"0bdfba21e1670ffa240eecbc7d7d0f75a109b5f9987c9bd7d43b730cb3e114b7"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.143066 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" 
event={"ID":"dbc985bf-ffef-456f-b4bd-37faeba9e8a1","Type":"ContainerStarted","Data":"c4cd1105eeec474131312a14d858b24b0a2ae0a4347a187d3f00c7e9422b706b"} Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.143950 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" event={"ID":"7fca11c0-cc43-457e-a797-610c31c9bc7f","Type":"ContainerStarted","Data":"abffdbc0f52aff6e809c3d714d4917e88fcf45bde67300b7fb8702d4e9edc369"} Nov 25 15:22:01 crc kubenswrapper[4965]: W1125 15:22:01.176376 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0952a381_bcc9_46de_bdac_bf2bdfe6ecc4.slice/crio-3bba2be92302a16af20334c85a7ba62067fa78c15569e819d71b44d8071e9f3b WatchSource:0}: Error finding container 3bba2be92302a16af20334c85a7ba62067fa78c15569e819d71b44d8071e9f3b: Status 404 returned error can't find the container with id 3bba2be92302a16af20334c85a7ba62067fa78c15569e819d71b44d8071e9f3b Nov 25 15:22:01 crc kubenswrapper[4965]: W1125 15:22:01.176659 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae7668e_2e54_482f_9340_8ffe413de1d1.slice/crio-ed6ea3cef12a92f00c57979220ce8ac317407228af88abcb5878d8fda9d1ab37 WatchSource:0}: Error finding container ed6ea3cef12a92f00c57979220ce8ac317407228af88abcb5878d8fda9d1ab37: Status 404 returned error can't find the container with id ed6ea3cef12a92f00c57979220ce8ac317407228af88abcb5878d8fda9d1ab37 Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.191892 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.194979 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-k6bww"] Nov 25 15:22:01 crc 
kubenswrapper[4965]: I1125 15:22:01.211075 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.216027 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.269637 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h"] Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.280295 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkdh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-k6bww_openstack-operators(cae07c7e-b337-46bb-8b04-06643ee9e6c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.283152 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkdh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-k6bww_openstack-operators(cae07c7e-b337-46bb-8b04-06643ee9e6c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.284540 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podUID="cae07c7e-b337-46bb-8b04-06643ee9e6c3" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.295588 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxshh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-xxb4h_openstack-operators(31a52118-75f6-4e53-a6b6-fd6378c61df8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.295766 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rfpsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-wmm8k_openstack-operators(b85374cc-6464-4dd6-9c38-0cabb8fd7834): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.297830 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rfpsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-wmm8k_openstack-operators(b85374cc-6464-4dd6-9c38-0cabb8fd7834): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.299998 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" podUID="b85374cc-6464-4dd6-9c38-0cabb8fd7834" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.316338 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zsqt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-8xprb_openstack-operators(f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.320554 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zsqt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-8xprb_openstack-operators(f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.321951 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" podUID="f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a" Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.377915 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5"] Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.397485 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6"] Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.399003 4965 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opens
tack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.i
o/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADAT
A_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antel
ope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DE
FAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_
IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9whjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5_openstack-operators(d781d742-fdc4-4480-90a3-6330b4add384): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.408476 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9whjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5_openstack-operators(d781d742-fdc4-4480-90a3-6330b4add384): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.410174 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" podUID="d781d742-fdc4-4480-90a3-6330b4add384" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.415397 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9hbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kckh6_openstack-operators(a5374ce4-8ac3-422b-9d62-9412dea697d3): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.417150 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" podUID="a5374ce4-8ac3-422b-9d62-9412dea697d3" Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.430880 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx"] Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.448177 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kl89j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-d5cc86f4b-d5dnx_openstack-operators(a3dd58f4-c4d6-43dc-b9fa-78d464337376): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.452517 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kl89j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-d5cc86f4b-d5dnx_openstack-operators(a3dd58f4-c4d6-43dc-b9fa-78d464337376): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.454297 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" podUID="a3dd58f4-c4d6-43dc-b9fa-78d464337376" Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.640587 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.640636 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.640815 4965 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 15:22:01 crc kubenswrapper[4965]: E1125 15:22:01.640876 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs podName:d8bdaece-696d-4306-a66b-46c7333eb788 nodeName:}" failed. No retries permitted until 2025-11-25 15:22:03.640848549 +0000 UTC m=+1068.608442295 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs") pod "openstack-operator-controller-manager-68587559f4-7lqhq" (UID: "d8bdaece-696d-4306-a66b-46c7333eb788") : secret "metrics-server-cert" not found Nov 25 15:22:01 crc kubenswrapper[4965]: I1125 15:22:01.658277 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-webhook-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.154093 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" 
event={"ID":"d781d742-fdc4-4480-90a3-6330b4add384","Type":"ContainerStarted","Data":"17ae1878f089cd36d6f61b3f52a90df240b3b5c7a58c2abe6a4249d4e36bfd1e"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.156089 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" event={"ID":"09d0b7cc-6fc4-40dd-a332-b405d049e756","Type":"ContainerStarted","Data":"29ff29c1cef093bef8e92f6ee4c6e527979ad477698e041137c887bef19c3803"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.156988 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" event={"ID":"b85374cc-6464-4dd6-9c38-0cabb8fd7834","Type":"ContainerStarted","Data":"45eca21062160a8dd14148f540efbec590d3ba1d1ccea974f5c4f90760b6e6f4"} Nov 25 15:22:02 crc kubenswrapper[4965]: E1125 15:22:02.159896 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" podUID="d781d742-fdc4-4480-90a3-6330b4add384" Nov 25 15:22:02 crc kubenswrapper[4965]: E1125 15:22:02.160371 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" podUID="b85374cc-6464-4dd6-9c38-0cabb8fd7834" Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.160997 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" event={"ID":"cae07c7e-b337-46bb-8b04-06643ee9e6c3","Type":"ContainerStarted","Data":"4bee34351c5d0d4062f992289f364e72c4c7e5a358ac2400f96cff3f6e108875"} Nov 25 15:22:02 crc kubenswrapper[4965]: E1125 15:22:02.168755 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podUID="cae07c7e-b337-46bb-8b04-06643ee9e6c3" Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.171072 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" event={"ID":"31a52118-75f6-4e53-a6b6-fd6378c61df8","Type":"ContainerStarted","Data":"c6c587774ff2e453fb660ec9bc96a3745426c4f5ababb6f4da19f3477e7f098d"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.172909 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" event={"ID":"f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a","Type":"ContainerStarted","Data":"cc67087ed7bce00610195e11aeb7593d88f5034929ed8a4b6a5879cc834ac07f"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.174504 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" event={"ID":"0952a381-bcc9-46de-bdac-bf2bdfe6ecc4","Type":"ContainerStarted","Data":"3bba2be92302a16af20334c85a7ba62067fa78c15569e819d71b44d8071e9f3b"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.176227 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" event={"ID":"cd724754-7539-4700-9911-5c0ce503d70f","Type":"ContainerStarted","Data":"5b21cb32b240e77eb81675be5ea73d94d46195d812e198fb3dffb233913dedb2"} Nov 25 15:22:02 crc kubenswrapper[4965]: E1125 15:22:02.176427 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" podUID="f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a" Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.187942 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" event={"ID":"a3dd58f4-c4d6-43dc-b9fa-78d464337376","Type":"ContainerStarted","Data":"9e436c573fc91d14484ef9cf5576f01a981cbfb9cee85da7cd79ead60caf6e05"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.189363 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" event={"ID":"a5374ce4-8ac3-422b-9d62-9412dea697d3","Type":"ContainerStarted","Data":"d57c8f7848ae2f5a669ebf8ed6fe3e3aa44a8a11ba837d4369276d047b33a22d"} Nov 25 15:22:02 crc kubenswrapper[4965]: I1125 15:22:02.190297 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" event={"ID":"3ae7668e-2e54-482f-9340-8ffe413de1d1","Type":"ContainerStarted","Data":"ed6ea3cef12a92f00c57979220ce8ac317407228af88abcb5878d8fda9d1ab37"} Nov 25 15:22:02 crc kubenswrapper[4965]: E1125 15:22:02.190851 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" podUID="a3dd58f4-c4d6-43dc-b9fa-78d464337376" Nov 25 15:22:02 crc kubenswrapper[4965]: E1125 15:22:02.191461 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" podUID="a5374ce4-8ac3-422b-9d62-9412dea697d3" Nov 25 15:22:03 crc kubenswrapper[4965]: E1125 15:22:03.200524 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" podUID="a5374ce4-8ac3-422b-9d62-9412dea697d3" Nov 25 15:22:03 crc kubenswrapper[4965]: E1125 15:22:03.201928 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" podUID="a3dd58f4-c4d6-43dc-b9fa-78d464337376" Nov 25 15:22:03 crc kubenswrapper[4965]: E1125 15:22:03.202021 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" podUID="f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a" Nov 25 15:22:03 crc kubenswrapper[4965]: E1125 15:22:03.202100 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podUID="cae07c7e-b337-46bb-8b04-06643ee9e6c3" Nov 25 15:22:03 crc kubenswrapper[4965]: E1125 15:22:03.204682 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" podUID="d781d742-fdc4-4480-90a3-6330b4add384" Nov 25 15:22:03 crc kubenswrapper[4965]: E1125 15:22:03.206784 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" podUID="b85374cc-6464-4dd6-9c38-0cabb8fd7834" Nov 25 15:22:03 crc kubenswrapper[4965]: I1125 15:22:03.673845 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:03 crc kubenswrapper[4965]: I1125 15:22:03.678654 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8bdaece-696d-4306-a66b-46c7333eb788-metrics-certs\") pod \"openstack-operator-controller-manager-68587559f4-7lqhq\" (UID: \"d8bdaece-696d-4306-a66b-46c7333eb788\") " pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:03 crc 
kubenswrapper[4965]: I1125 15:22:03.838267 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:17 crc kubenswrapper[4965]: E1125 15:22:17.984271 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:553b1288b330ad05771d59c6b73c1681c95f457e8475682f9ad0d2e6b85f37e9" Nov 25 15:22:17 crc kubenswrapper[4965]: E1125 15:22:17.986125 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:553b1288b330ad05771d59c6b73c1681c95f457e8475682f9ad0d2e6b85f37e9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zprtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-79856dc55c-wt2wg_openstack-operators(764687bc-d3b6-47b3-96d8-8c31f47ab473): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:18 crc kubenswrapper[4965]: E1125 15:22:18.389831 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:d38faa9070da05487afdaa9e261ad39274c2ed862daf42efa460a040431f1991" Nov 25 15:22:18 crc kubenswrapper[4965]: E1125 15:22:18.390042 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:d38faa9070da05487afdaa9e261ad39274c2ed862daf42efa460a040431f1991,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2zdrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-68b95954c9-kdff2_openstack-operators(579b7594-cdbd-4b63-9405-0321a133d2d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:19 crc kubenswrapper[4965]: E1125 15:22:19.879995 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b" Nov 25 15:22:19 crc kubenswrapper[4965]: E1125 15:22:19.880506 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-44p58,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-66cf5c67ff-fwc8g_openstack-operators(0952a381-bcc9-46de-bdac-bf2bdfe6ecc4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:20 crc kubenswrapper[4965]: E1125 15:22:20.900481 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04" Nov 25 15:22:20 crc kubenswrapper[4965]: E1125 15:22:20.900701 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tldd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-cb6c4fdb7-j9tml_openstack-operators(af96aeb0-49ef-430d-9780-791c7a1b64da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:21 crc kubenswrapper[4965]: E1125 15:22:21.927571 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:b582189b55fddc180a6d468c9dba7078009a693db37b4093d4ba0c99ec675377" Nov 25 15:22:21 crc kubenswrapper[4965]: E1125 15:22:21.927743 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:b582189b55fddc180a6d468c9dba7078009a693db37b4093d4ba0c99ec675377,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-89rkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5bfcdc958c-f6vxm_openstack-operators(7fca11c0-cc43-457e-a797-610c31c9bc7f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:22 crc kubenswrapper[4965]: E1125 15:22:22.453184 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13" Nov 25 15:22:22 crc kubenswrapper[4965]: E1125 15:22:22.453373 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kzgsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-fd75fd47d-6hhqt_openstack-operators(09d0b7cc-6fc4-40dd-a332-b405d049e756): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:22 crc kubenswrapper[4965]: E1125 15:22:22.940500 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c" Nov 25 15:22:22 crc kubenswrapper[4965]: E1125 15:22:22.941023 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nqs4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-7fzrj_openstack-operators(cd724754-7539-4700-9911-5c0ce503d70f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:32 crc kubenswrapper[4965]: E1125 15:22:32.681019 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd" Nov 25 15:22:32 crc kubenswrapper[4965]: E1125 15:22:32.682139 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOME
TER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom
:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack
-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVa
r{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUT
E_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueF
rom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9whjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5_openstack-operators(d781d742-fdc4-4480-90a3-6330b4add384): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:34 crc kubenswrapper[4965]: I1125 15:22:34.182418 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq"] Nov 25 15:22:35 crc kubenswrapper[4965]: W1125 15:22:35.253553 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bdaece_696d_4306_a66b_46c7333eb788.slice/crio-16d245b4f574cf7beb1fa592c0c2bd1e768bd75350c80b8256903362c3ce5fed WatchSource:0}: Error finding container 16d245b4f574cf7beb1fa592c0c2bd1e768bd75350c80b8256903362c3ce5fed: Status 404 returned error can't find the container with id 16d245b4f574cf7beb1fa592c0c2bd1e768bd75350c80b8256903362c3ce5fed Nov 25 15:22:35 crc kubenswrapper[4965]: I1125 15:22:35.425653 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" 
event={"ID":"d8bdaece-696d-4306-a66b-46c7333eb788","Type":"ContainerStarted","Data":"16d245b4f574cf7beb1fa592c0c2bd1e768bd75350c80b8256903362c3ce5fed"} Nov 25 15:22:37 crc kubenswrapper[4965]: E1125 15:22:37.115048 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f" Nov 25 15:22:37 crc kubenswrapper[4965]: E1125 15:22:37.115222 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkdh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-k6bww_openstack-operators(cae07c7e-b337-46bb-8b04-06643ee9e6c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:37 crc kubenswrapper[4965]: E1125 15:22:37.142339 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Nov 25 15:22:37 crc kubenswrapper[4965]: E1125 15:22:37.142498 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9hbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kckh6_openstack-operators(a5374ce4-8ac3-422b-9d62-9412dea697d3): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:37 crc kubenswrapper[4965]: E1125 15:22:37.143951 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" podUID="a5374ce4-8ac3-422b-9d62-9412dea697d3" Nov 25 15:22:38 crc kubenswrapper[4965]: E1125 15:22:38.363499 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 15:22:38 crc kubenswrapper[4965]: E1125 15:22:38.364218 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nqs4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-7fzrj_openstack-operators(cd724754-7539-4700-9911-5c0ce503d70f): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 15:22:38 crc kubenswrapper[4965]: E1125 15:22:38.366268 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" podUID="cd724754-7539-4700-9911-5c0ce503d70f" Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.475475 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796" event={"ID":"90d4de2d-51f0-4b18-8272-905b733fc714","Type":"ContainerStarted","Data":"216331cf90b47303321e730f6206942b807df272282a662ad35af6d2e3b8b990"} Nov 25 
15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.484275 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.492549 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" event={"ID":"3ae7668e-2e54-482f-9340-8ffe413de1d1","Type":"ContainerStarted","Data":"9870c8b2280bda4faf032c107b5f101ca0d8bc355d94508411d141979d30c61a"} Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.512353 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" event={"ID":"dbc985bf-ffef-456f-b4bd-37faeba9e8a1","Type":"ContainerStarted","Data":"450b3ba1096203e2e438c561bcca4f8ec7525c37a16db5b69e84b629ebc301ee"} Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.515361 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" event={"ID":"756bbaba-31b3-4cd8-b3a6-6a3e0b805261","Type":"ContainerStarted","Data":"f77ba419cf4ba7065daa31daa2f56afbedf143575b0ce270b52641fdc6a8092c"} Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.521740 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb" event={"ID":"6493c01f-7b22-4a04-9b25-b17ad7c790a1","Type":"ContainerStarted","Data":"a5a13264fca13d518582863c8a36ca33487c17de40796632594da9f2d4c81119"} Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 15:22:38.528266 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" event={"ID":"b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4","Type":"ContainerStarted","Data":"7e8ce303025c9d291de62299cb09be2997a680c824ca8d51ccdaca6da0df5962"} Nov 25 15:22:38 crc kubenswrapper[4965]: I1125 
15:22:38.538545 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" podStartSLOduration=39.538527385 podStartE2EDuration="39.538527385s" podCreationTimestamp="2025-11-25 15:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:22:38.537562189 +0000 UTC m=+1103.505155935" watchObservedRunningTime="2025-11-25 15:22:38.538527385 +0000 UTC m=+1103.506121131" Nov 25 15:22:39 crc kubenswrapper[4965]: I1125 15:22:39.536813 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" event={"ID":"34390094-977c-4d9b-a9dd-8f4d4a5a89ad","Type":"ContainerStarted","Data":"f1a0b0ec2241db8346fe582322a2028671fac5abd51dafa20e84832af8fe77d6"} Nov 25 15:22:39 crc kubenswrapper[4965]: I1125 15:22:39.539236 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" event={"ID":"f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a","Type":"ContainerStarted","Data":"d0ea8f900bbcef0845cc3161ff2d75c0e2737bbca898f6d3d4c3b163b434adc0"} Nov 25 15:22:39 crc kubenswrapper[4965]: I1125 15:22:39.540730 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" event={"ID":"d8bdaece-696d-4306-a66b-46c7333eb788","Type":"ContainerStarted","Data":"57c2185826c22b1a9e1fafc6c036122474c2f2a1510884886882202f72a194b4"} Nov 25 15:22:41 crc kubenswrapper[4965]: I1125 15:22:41.565534 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" event={"ID":"b85374cc-6464-4dd6-9c38-0cabb8fd7834","Type":"ContainerStarted","Data":"11dc78ee9f32a22d4706569d0d04f4ee50c7782cbfd719c25e1d1ebc6b547199"} Nov 25 15:22:41 crc kubenswrapper[4965]: I1125 
15:22:41.568712 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl" event={"ID":"2b7be07d-fe11-494c-97b3-fa95b997450f","Type":"ContainerStarted","Data":"ff19de7b83eac07cb34681f52a3182c8104f01d2673e495c4c08055b309c6ab2"} Nov 25 15:22:41 crc kubenswrapper[4965]: I1125 15:22:41.574618 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" event={"ID":"a3dd58f4-c4d6-43dc-b9fa-78d464337376","Type":"ContainerStarted","Data":"685958e8b9534708ffb4454ca57b7235b2d23c40f0aa08b3bfe5b4e6b188d559"} Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.026858 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.027063 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxshh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-xxb4h_openstack-operators(31a52118-75f6-4e53-a6b6-fd6378c61df8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.028511 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" podUID="31a52118-75f6-4e53-a6b6-fd6378c61df8" Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.451581 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" podUID="d781d742-fdc4-4480-90a3-6330b4add384" Nov 25 15:22:42 
crc kubenswrapper[4965]: E1125 15:22:42.534108 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" podUID="af96aeb0-49ef-430d-9780-791c7a1b64da" Nov 25 15:22:42 crc kubenswrapper[4965]: I1125 15:22:42.598142 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" event={"ID":"af96aeb0-49ef-430d-9780-791c7a1b64da","Type":"ContainerStarted","Data":"e673b92ad6cfa1b0f7f96c191e8d2fb0cb095b222bc18c9f3f2e69465579b19a"} Nov 25 15:22:42 crc kubenswrapper[4965]: I1125 15:22:42.609797 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" event={"ID":"d781d742-fdc4-4480-90a3-6330b4add384","Type":"ContainerStarted","Data":"c9971c2d99e9b25f752f59a74cba8b2bf696dd59bd831fdc222133d7b4ac26d9"} Nov 25 15:22:42 crc kubenswrapper[4965]: I1125 15:22:42.614528 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" event={"ID":"dbc985bf-ffef-456f-b4bd-37faeba9e8a1","Type":"ContainerStarted","Data":"458cd5a0d28780290b3139c8c225c6a1e85e95ed96f1469ec514fe35b8efddef"} Nov 25 15:22:42 crc kubenswrapper[4965]: I1125 15:22:42.614633 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.619231 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" podUID="d781d742-fdc4-4480-90a3-6330b4add384" Nov 25 15:22:42 crc kubenswrapper[4965]: I1125 15:22:42.753567 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" podStartSLOduration=3.123755593 podStartE2EDuration="44.753544224s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.541451949 +0000 UTC m=+1065.509045695" lastFinishedPulling="2025-11-25 15:22:42.17124054 +0000 UTC m=+1107.138834326" observedRunningTime="2025-11-25 15:22:42.717528904 +0000 UTC m=+1107.685122680" watchObservedRunningTime="2025-11-25 15:22:42.753544224 +0000 UTC m=+1107.721137970" Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.783485 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" podUID="764687bc-d3b6-47b3-96d8-8c31f47ab473" Nov 25 15:22:42 crc kubenswrapper[4965]: E1125 15:22:42.864819 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" podUID="09d0b7cc-6fc4-40dd-a332-b405d049e756" Nov 25 15:22:43 crc kubenswrapper[4965]: E1125 15:22:43.265984 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podUID="cae07c7e-b337-46bb-8b04-06643ee9e6c3" Nov 25 15:22:43 crc kubenswrapper[4965]: E1125 
15:22:43.367154 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" podUID="0952a381-bcc9-46de-bdac-bf2bdfe6ecc4" Nov 25 15:22:43 crc kubenswrapper[4965]: E1125 15:22:43.464599 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" podUID="579b7594-cdbd-4b63-9405-0321a133d2d0" Nov 25 15:22:43 crc kubenswrapper[4965]: E1125 15:22:43.467069 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" podUID="7fca11c0-cc43-457e-a797-610c31c9bc7f" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.631333 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" event={"ID":"f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a","Type":"ContainerStarted","Data":"b02f97cd09bd95b9e7df0ec3dc71c991fb4c66188bff2212739aacc53e50c9f0"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.632223 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.643788 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" 
event={"ID":"0952a381-bcc9-46de-bdac-bf2bdfe6ecc4","Type":"ContainerStarted","Data":"f8629717d3b29e4140ba2c86bad12894b51339445a150a3fafcb77c223d5c606"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.656268 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" event={"ID":"756bbaba-31b3-4cd8-b3a6-6a3e0b805261","Type":"ContainerStarted","Data":"3225b6b0c47fdcbc1c57a243415cc19cfee7c427b00cdc8ef76f722f0f9134c9"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.657209 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.669396 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.673069 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" event={"ID":"579b7594-cdbd-4b63-9405-0321a133d2d0","Type":"ContainerStarted","Data":"03bbc172966010bd74da0f1d11b18fe5f9a71e0a1630b06172457569e729ff9c"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.697322 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796" event={"ID":"90d4de2d-51f0-4b18-8272-905b733fc714","Type":"ContainerStarted","Data":"142e7c44090e4a2e1fdd247bb013ec5f393201523d105915977e12cfa62c2666"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.697741 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.703942 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" podStartSLOduration=4.627376294 podStartE2EDuration="45.703922153s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.316204178 +0000 UTC m=+1066.283797924" lastFinishedPulling="2025-11-25 15:22:42.392750017 +0000 UTC m=+1107.360343783" observedRunningTime="2025-11-25 15:22:43.6913171 +0000 UTC m=+1108.658910836" watchObservedRunningTime="2025-11-25 15:22:43.703922153 +0000 UTC m=+1108.671515899" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.712659 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.720999 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" event={"ID":"764687bc-d3b6-47b3-96d8-8c31f47ab473","Type":"ContainerStarted","Data":"e67f16a47b86479d21abb3fe2b0a06e76cc8907da31457bf3b6ed1eb36ed124f"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.742687 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" event={"ID":"34390094-977c-4d9b-a9dd-8f4d4a5a89ad","Type":"ContainerStarted","Data":"074ec1642e1ccddae53384f44c8caa72ab2baeee43bf278b56b5326b27485ab3"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.743420 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.766753 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" event={"ID":"cd724754-7539-4700-9911-5c0ce503d70f","Type":"ContainerStarted","Data":"bf7af18d3b2245ee7356160e73f0a095d4c9949389d7085013ce3a3cbddff06f"} 
Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.766795 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" event={"ID":"cd724754-7539-4700-9911-5c0ce503d70f","Type":"ContainerStarted","Data":"03e57cacad9d393ed4b7cb25ea2f7080f5769d5241219e869cca7ceb8d8de5cf"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.767420 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.772258 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.788576 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" event={"ID":"7fca11c0-cc43-457e-a797-610c31c9bc7f","Type":"ContainerStarted","Data":"eeb6885cc5992ddaaceafe1ef04bc1631c42e867dfc9e8a6ee01174575f9efe5"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.811447 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" event={"ID":"3ae7668e-2e54-482f-9340-8ffe413de1d1","Type":"ContainerStarted","Data":"bbc9300510265d07888ded2abfd833205fe37ffe4ed4a7c8931ef16e7477421e"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.812722 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.835556 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.854380 4965 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" event={"ID":"09d0b7cc-6fc4-40dd-a332-b405d049e756","Type":"ContainerStarted","Data":"2834a7dcef988f764853d4a52a108207ef60c6fb343a46517edeac1ce6b2f122"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.855930 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-68587559f4-7lqhq" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.888060 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" event={"ID":"b85374cc-6464-4dd6-9c38-0cabb8fd7834","Type":"ContainerStarted","Data":"9fc65d41b50080c0b196d812ba1c429054402fd62f8e0dec8f979bf6bb328119"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.889119 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.963224 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl" event={"ID":"2b7be07d-fe11-494c-97b3-fa95b997450f","Type":"ContainerStarted","Data":"8bb0c6af0f0ed4d9d543d85cffe5da5012eb7e7fa14ff242e44422905b8ee61f"} Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.963808 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl" Nov 25 15:22:43 crc kubenswrapper[4965]: I1125 15:22:43.998101 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" event={"ID":"cae07c7e-b337-46bb-8b04-06643ee9e6c3","Type":"ContainerStarted","Data":"84a5b6717161365161030e9ecc7a3728688b28957f138ca3beb354cc8668f9a9"} Nov 25 15:22:44 crc kubenswrapper[4965]: E1125 15:22:44.001197 4965 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podUID="cae07c7e-b337-46bb-8b04-06643ee9e6c3" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.022532 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-7nxmk" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.074311 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-v96z8" podStartSLOduration=4.695908999 podStartE2EDuration="46.074292361s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.843729382 +0000 UTC m=+1065.811323128" lastFinishedPulling="2025-11-25 15:22:42.222112734 +0000 UTC m=+1107.189706490" observedRunningTime="2025-11-25 15:22:44.034236251 +0000 UTC m=+1109.001829987" watchObservedRunningTime="2025-11-25 15:22:44.074292361 +0000 UTC m=+1109.041886097" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.293697 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" podStartSLOduration=5.203558285 podStartE2EDuration="46.29367804s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.244269094 +0000 UTC m=+1066.211862840" lastFinishedPulling="2025-11-25 15:22:42.334388839 +0000 UTC m=+1107.301982595" observedRunningTime="2025-11-25 15:22:44.292840988 +0000 UTC m=+1109.260434734" watchObservedRunningTime="2025-11-25 15:22:44.29367804 +0000 UTC m=+1109.261271786" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 
15:22:44.344897 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-hg488" podStartSLOduration=3.784651719 podStartE2EDuration="45.344882914s" podCreationTimestamp="2025-11-25 15:21:59 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.183011859 +0000 UTC m=+1066.150605605" lastFinishedPulling="2025-11-25 15:22:42.743243054 +0000 UTC m=+1107.710836800" observedRunningTime="2025-11-25 15:22:44.341401819 +0000 UTC m=+1109.308995565" watchObservedRunningTime="2025-11-25 15:22:44.344882914 +0000 UTC m=+1109.312476660" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.436110 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-7k79p" podStartSLOduration=3.703674443 podStartE2EDuration="46.436088915s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.297106901 +0000 UTC m=+1065.264700647" lastFinishedPulling="2025-11-25 15:22:43.029521373 +0000 UTC m=+1107.997115119" observedRunningTime="2025-11-25 15:22:44.418494967 +0000 UTC m=+1109.386088713" watchObservedRunningTime="2025-11-25 15:22:44.436088915 +0000 UTC m=+1109.403682661" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.451524 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl" podStartSLOduration=4.451765808 podStartE2EDuration="46.451503685s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.570141899 +0000 UTC m=+1065.537735645" lastFinishedPulling="2025-11-25 15:22:42.569879776 +0000 UTC m=+1107.537473522" observedRunningTime="2025-11-25 15:22:44.450282971 +0000 UTC m=+1109.417876717" watchObservedRunningTime="2025-11-25 15:22:44.451503685 +0000 UTC m=+1109.419097431" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.528308 
4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" podStartSLOduration=3.795818239 podStartE2EDuration="45.528294024s" podCreationTimestamp="2025-11-25 15:21:59 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.295707162 +0000 UTC m=+1066.263300908" lastFinishedPulling="2025-11-25 15:22:43.028182947 +0000 UTC m=+1107.995776693" observedRunningTime="2025-11-25 15:22:44.490376893 +0000 UTC m=+1109.457970639" watchObservedRunningTime="2025-11-25 15:22:44.528294024 +0000 UTC m=+1109.495887770" Nov 25 15:22:44 crc kubenswrapper[4965]: I1125 15:22:44.635513 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-24796" podStartSLOduration=4.255835942 podStartE2EDuration="46.635494331s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.652167597 +0000 UTC m=+1065.619761343" lastFinishedPulling="2025-11-25 15:22:43.031825986 +0000 UTC m=+1107.999419732" observedRunningTime="2025-11-25 15:22:44.630602558 +0000 UTC m=+1109.598196304" watchObservedRunningTime="2025-11-25 15:22:44.635494331 +0000 UTC m=+1109.603088077" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.018995 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb" event={"ID":"6493c01f-7b22-4a04-9b25-b17ad7c790a1","Type":"ContainerStarted","Data":"060376c5705b5dbccf06213ba504acc3743a5f6bfee857afab610478567ed082"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.019325 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.021511 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.031242 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" event={"ID":"579b7594-cdbd-4b63-9405-0321a133d2d0","Type":"ContainerStarted","Data":"53814990a84ac90b7951e5e5824d21c8b8c2d5f622e0944f9caad3a7dc23dd37"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.031355 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.044097 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" event={"ID":"a3dd58f4-c4d6-43dc-b9fa-78d464337376","Type":"ContainerStarted","Data":"d582ffc809ebb43c583daf002f8945c9eb93928fcfb3202e2d1dce59ca300827"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.044779 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.054672 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" event={"ID":"09d0b7cc-6fc4-40dd-a332-b405d049e756","Type":"ContainerStarted","Data":"3609fe216180d9b7e92f83ae14925a01a0f0bac841a49d8a2825ddf9aba52daa"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.055394 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.058765 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" Nov 25 15:22:45 crc 
kubenswrapper[4965]: I1125 15:22:45.066497 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-77tlb" podStartSLOduration=4.337305914 podStartE2EDuration="47.066482208s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.805537335 +0000 UTC m=+1065.773131081" lastFinishedPulling="2025-11-25 15:22:43.534713629 +0000 UTC m=+1108.502307375" observedRunningTime="2025-11-25 15:22:45.036984526 +0000 UTC m=+1110.004578272" watchObservedRunningTime="2025-11-25 15:22:45.066482208 +0000 UTC m=+1110.034075954" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.078456 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" event={"ID":"b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4","Type":"ContainerStarted","Data":"dc0c981564f10f620ddc1c6e9346851fba33e4de48dc8bcd552c532fba5858b9"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.078752 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.086370 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.093879 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" event={"ID":"af96aeb0-49ef-430d-9780-791c7a1b64da","Type":"ContainerStarted","Data":"d39fabb9f342bd699ca0f4562141c6d60e4dc599fd0a52507297c7286a57f3e0"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.093938 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" Nov 25 15:22:45 crc 
kubenswrapper[4965]: I1125 15:22:45.105316 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" event={"ID":"7fca11c0-cc43-457e-a797-610c31c9bc7f","Type":"ContainerStarted","Data":"6815e610c86697273ec379e78aa608bf70bac32fa33651eb7fb06dc7c0b8f580"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.105436 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.117182 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" event={"ID":"764687bc-d3b6-47b3-96d8-8c31f47ab473","Type":"ContainerStarted","Data":"7232e8e3c0fbcf5486fa5c3d7fa906fb52183c532358ac9dbbacada96cb3bba2"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.117900 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.135097 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" event={"ID":"0952a381-bcc9-46de-bdac-bf2bdfe6ecc4","Type":"ContainerStarted","Data":"323801c46e6d3e5098c5ad78b5ba27a13dd0a2eded6b32e30edc61336315857a"} Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.135170 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.148522 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cb74df96-wmm8k" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.149530 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-79556f57fc-8xprb" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.163082 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-cf9pl" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.163998 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" podStartSLOduration=3.120213775 podStartE2EDuration="47.163979641s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.354451309 +0000 UTC m=+1065.322045055" lastFinishedPulling="2025-11-25 15:22:44.398217185 +0000 UTC m=+1109.365810921" observedRunningTime="2025-11-25 15:22:45.158564964 +0000 UTC m=+1110.126158720" watchObservedRunningTime="2025-11-25 15:22:45.163979641 +0000 UTC m=+1110.131573387" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.198601 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" podStartSLOduration=3.365344428 podStartE2EDuration="47.198585932s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.507909248 +0000 UTC m=+1065.475502994" lastFinishedPulling="2025-11-25 15:22:44.341150752 +0000 UTC m=+1109.308744498" observedRunningTime="2025-11-25 15:22:45.197335009 +0000 UTC m=+1110.164928755" watchObservedRunningTime="2025-11-25 15:22:45.198585932 +0000 UTC m=+1110.166179678" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.216493 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" podStartSLOduration=3.758970932 podStartE2EDuration="47.21647598s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.183116562 
+0000 UTC m=+1066.150710308" lastFinishedPulling="2025-11-25 15:22:44.64062161 +0000 UTC m=+1109.608215356" observedRunningTime="2025-11-25 15:22:45.213758646 +0000 UTC m=+1110.181352392" watchObservedRunningTime="2025-11-25 15:22:45.21647598 +0000 UTC m=+1110.184069726" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.272462 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" podStartSLOduration=4.600332827 podStartE2EDuration="47.272439372s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.112997107 +0000 UTC m=+1066.080590853" lastFinishedPulling="2025-11-25 15:22:43.785103652 +0000 UTC m=+1108.752697398" observedRunningTime="2025-11-25 15:22:45.247193976 +0000 UTC m=+1110.214787722" watchObservedRunningTime="2025-11-25 15:22:45.272439372 +0000 UTC m=+1110.240033118" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.302396 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" podStartSLOduration=3.795136474 podStartE2EDuration="47.302374546s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.194638215 +0000 UTC m=+1066.162231961" lastFinishedPulling="2025-11-25 15:22:44.701876287 +0000 UTC m=+1109.669470033" observedRunningTime="2025-11-25 15:22:45.277310694 +0000 UTC m=+1110.244904440" watchObservedRunningTime="2025-11-25 15:22:45.302374546 +0000 UTC m=+1110.269968302" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.357739 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" podStartSLOduration=3.788173973 podStartE2EDuration="47.357718352s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.774774409 +0000 UTC 
m=+1065.742368155" lastFinishedPulling="2025-11-25 15:22:44.344318788 +0000 UTC m=+1109.311912534" observedRunningTime="2025-11-25 15:22:45.351252927 +0000 UTC m=+1110.318846673" watchObservedRunningTime="2025-11-25 15:22:45.357718352 +0000 UTC m=+1110.325312098" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.502828 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-774b86978c-7fj92" podStartSLOduration=3.874905364 podStartE2EDuration="47.50280813s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:00.011250085 +0000 UTC m=+1064.978843831" lastFinishedPulling="2025-11-25 15:22:43.639152851 +0000 UTC m=+1108.606746597" observedRunningTime="2025-11-25 15:22:45.39033222 +0000 UTC m=+1110.357925966" watchObservedRunningTime="2025-11-25 15:22:45.50280813 +0000 UTC m=+1110.470401876" Nov 25 15:22:45 crc kubenswrapper[4965]: I1125 15:22:45.505849 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-d5dnx" podStartSLOduration=5.165475181 podStartE2EDuration="47.505828893s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.44806061 +0000 UTC m=+1066.415654356" lastFinishedPulling="2025-11-25 15:22:43.788414322 +0000 UTC m=+1108.756008068" observedRunningTime="2025-11-25 15:22:45.505484373 +0000 UTC m=+1110.473078119" watchObservedRunningTime="2025-11-25 15:22:45.505828893 +0000 UTC m=+1110.473422639" Nov 25 15:22:48 crc kubenswrapper[4965]: I1125 15:22:48.157456 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" event={"ID":"31a52118-75f6-4e53-a6b6-fd6378c61df8","Type":"ContainerStarted","Data":"8b1e6187c7d5c151586628a10aa8c68b33e7eea08ee24e5c358e927ebbd84e6c"} Nov 25 15:22:48 crc kubenswrapper[4965]: I1125 15:22:48.158146 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" event={"ID":"31a52118-75f6-4e53-a6b6-fd6378c61df8","Type":"ContainerStarted","Data":"4ddb286cc1b92dbe9aa14c5dc4ee479e825e942c7bcf7e7fe7081bcb758820a5"} Nov 25 15:22:48 crc kubenswrapper[4965]: I1125 15:22:48.158459 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" Nov 25 15:22:48 crc kubenswrapper[4965]: I1125 15:22:48.181522 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" podStartSLOduration=3.470037304 podStartE2EDuration="49.181500937s" podCreationTimestamp="2025-11-25 15:21:59 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.295470374 +0000 UTC m=+1066.263064120" lastFinishedPulling="2025-11-25 15:22:47.006933997 +0000 UTC m=+1111.974527753" observedRunningTime="2025-11-25 15:22:48.1764697 +0000 UTC m=+1113.144063446" watchObservedRunningTime="2025-11-25 15:22:48.181500937 +0000 UTC m=+1113.149094683" Nov 25 15:22:48 crc kubenswrapper[4965]: E1125 15:22:48.772800 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" podUID="a5374ce4-8ac3-422b-9d62-9412dea697d3" Nov 25 15:22:49 crc kubenswrapper[4965]: I1125 15:22:49.200410 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-6hhqt" Nov 25 15:22:49 crc kubenswrapper[4965]: I1125 15:22:49.339788 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-j9tml" Nov 25 15:22:49 crc kubenswrapper[4965]: I1125 15:22:49.419991 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-fwc8g" Nov 25 15:22:49 crc kubenswrapper[4965]: I1125 15:22:49.525585 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-7fzrj" Nov 25 15:22:53 crc kubenswrapper[4965]: I1125 15:22:53.261847 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:22:53 crc kubenswrapper[4965]: I1125 15:22:53.262184 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:22:56 crc kubenswrapper[4965]: E1125 15:22:56.781225 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podUID="cae07c7e-b337-46bb-8b04-06643ee9e6c3" Nov 25 15:22:58 crc kubenswrapper[4965]: I1125 15:22:58.542745 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wt2wg" Nov 25 15:22:58 crc kubenswrapper[4965]: I1125 
15:22:58.623473 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-kdff2" Nov 25 15:22:58 crc kubenswrapper[4965]: I1125 15:22:58.897408 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f6vxm" Nov 25 15:22:59 crc kubenswrapper[4965]: I1125 15:22:59.240043 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" event={"ID":"d781d742-fdc4-4480-90a3-6330b4add384","Type":"ContainerStarted","Data":"a03886130e5b08ebbb106cdc0e09f26d50d52cc4a80cfafab8cfd3a09fddb658"} Nov 25 15:22:59 crc kubenswrapper[4965]: I1125 15:22:59.240220 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" Nov 25 15:22:59 crc kubenswrapper[4965]: I1125 15:22:59.271682 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" podStartSLOduration=4.201983207 podStartE2EDuration="1m1.271665031s" podCreationTimestamp="2025-11-25 15:21:58 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.398527815 +0000 UTC m=+1066.366121561" lastFinishedPulling="2025-11-25 15:22:58.468209609 +0000 UTC m=+1123.435803385" observedRunningTime="2025-11-25 15:22:59.267817877 +0000 UTC m=+1124.235411623" watchObservedRunningTime="2025-11-25 15:22:59.271665031 +0000 UTC m=+1124.239258787" Nov 25 15:22:59 crc kubenswrapper[4965]: I1125 15:22:59.666267 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-xxb4h" Nov 25 15:23:05 crc kubenswrapper[4965]: I1125 15:23:05.285711 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" event={"ID":"a5374ce4-8ac3-422b-9d62-9412dea697d3","Type":"ContainerStarted","Data":"de59774e8929c04672238aff2096f40a441bea4ed1234e573d6bfed093ccfbb6"} Nov 25 15:23:05 crc kubenswrapper[4965]: I1125 15:23:05.317706 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kckh6" podStartSLOduration=3.546743846 podStartE2EDuration="1m6.317680671s" podCreationTimestamp="2025-11-25 15:21:59 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.415271409 +0000 UTC m=+1066.382865155" lastFinishedPulling="2025-11-25 15:23:04.186208224 +0000 UTC m=+1129.153801980" observedRunningTime="2025-11-25 15:23:05.313309453 +0000 UTC m=+1130.280903239" watchObservedRunningTime="2025-11-25 15:23:05.317680671 +0000 UTC m=+1130.285274467" Nov 25 15:23:09 crc kubenswrapper[4965]: I1125 15:23:09.318541 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" event={"ID":"cae07c7e-b337-46bb-8b04-06643ee9e6c3","Type":"ContainerStarted","Data":"d26cbd3c4c2d3572c55c1d4783375bfed13a42ec41fb033ae0a5aca1c6f9341b"} Nov 25 15:23:09 crc kubenswrapper[4965]: I1125 15:23:09.320209 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:23:09 crc kubenswrapper[4965]: I1125 15:23:09.336430 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" podStartSLOduration=3.070805373 podStartE2EDuration="1m10.33637886s" podCreationTimestamp="2025-11-25 15:21:59 +0000 UTC" firstStartedPulling="2025-11-25 15:22:01.280148929 +0000 UTC m=+1066.247742665" lastFinishedPulling="2025-11-25 15:23:08.545722396 +0000 UTC m=+1133.513316152" observedRunningTime="2025-11-25 15:23:09.336257606 +0000 
UTC m=+1134.303851362" watchObservedRunningTime="2025-11-25 15:23:09.33637886 +0000 UTC m=+1134.303972626" Nov 25 15:23:10 crc kubenswrapper[4965]: I1125 15:23:10.876957 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5" Nov 25 15:23:19 crc kubenswrapper[4965]: I1125 15:23:19.932521 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-864885998-k6bww" Nov 25 15:23:23 crc kubenswrapper[4965]: I1125 15:23:23.260749 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:23:23 crc kubenswrapper[4965]: I1125 15:23:23.261113 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.406032 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vpsvc"] Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.408595 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.412166 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.416726 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vqvd9" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.416996 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.417034 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.427823 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vpsvc"] Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.444001 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101a9e83-a4ed-43d5-a015-4c52a9fbe424-config\") pod \"dnsmasq-dns-675f4bcbfc-vpsvc\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.444212 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fpm6\" (UniqueName: \"kubernetes.io/projected/101a9e83-a4ed-43d5-a015-4c52a9fbe424-kube-api-access-6fpm6\") pod \"dnsmasq-dns-675f4bcbfc-vpsvc\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.483935 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w6qsn"] Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.485936 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.490254 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.528226 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w6qsn"] Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.549787 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fpm6\" (UniqueName: \"kubernetes.io/projected/101a9e83-a4ed-43d5-a015-4c52a9fbe424-kube-api-access-6fpm6\") pod \"dnsmasq-dns-675f4bcbfc-vpsvc\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.549852 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.549888 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-config\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.549909 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101a9e83-a4ed-43d5-a015-4c52a9fbe424-config\") pod \"dnsmasq-dns-675f4bcbfc-vpsvc\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 
15:23:34.550044 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzdcr\" (UniqueName: \"kubernetes.io/projected/5e810938-3cb6-4551-9568-ddd4a4828ed6-kube-api-access-wzdcr\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.551424 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101a9e83-a4ed-43d5-a015-4c52a9fbe424-config\") pod \"dnsmasq-dns-675f4bcbfc-vpsvc\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.594076 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fpm6\" (UniqueName: \"kubernetes.io/projected/101a9e83-a4ed-43d5-a015-4c52a9fbe424-kube-api-access-6fpm6\") pod \"dnsmasq-dns-675f4bcbfc-vpsvc\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.651837 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzdcr\" (UniqueName: \"kubernetes.io/projected/5e810938-3cb6-4551-9568-ddd4a4828ed6-kube-api-access-wzdcr\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.651915 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.651948 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-config\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.652850 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-config\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.652860 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.693827 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzdcr\" (UniqueName: \"kubernetes.io/projected/5e810938-3cb6-4551-9568-ddd4a4828ed6-kube-api-access-wzdcr\") pod \"dnsmasq-dns-78dd6ddcc-w6qsn\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.728656 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:23:34 crc kubenswrapper[4965]: I1125 15:23:34.806496 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:23:35 crc kubenswrapper[4965]: I1125 15:23:35.065749 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vpsvc"] Nov 25 15:23:35 crc kubenswrapper[4965]: I1125 15:23:35.369653 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w6qsn"] Nov 25 15:23:35 crc kubenswrapper[4965]: W1125 15:23:35.372218 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e810938_3cb6_4551_9568_ddd4a4828ed6.slice/crio-e1d9d8ced7e4eda5fa6824c1466c907cbdcf3a6b0c8a0a945bee9f1199f813bd WatchSource:0}: Error finding container e1d9d8ced7e4eda5fa6824c1466c907cbdcf3a6b0c8a0a945bee9f1199f813bd: Status 404 returned error can't find the container with id e1d9d8ced7e4eda5fa6824c1466c907cbdcf3a6b0c8a0a945bee9f1199f813bd Nov 25 15:23:35 crc kubenswrapper[4965]: I1125 15:23:35.551334 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" event={"ID":"101a9e83-a4ed-43d5-a015-4c52a9fbe424","Type":"ContainerStarted","Data":"48386f11544cf6564aeabcff03934aa8a9d21fb614b3d1a440e376a4afc548bd"} Nov 25 15:23:35 crc kubenswrapper[4965]: I1125 15:23:35.552952 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" event={"ID":"5e810938-3cb6-4551-9568-ddd4a4828ed6","Type":"ContainerStarted","Data":"e1d9d8ced7e4eda5fa6824c1466c907cbdcf3a6b0c8a0a945bee9f1199f813bd"} Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.270078 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vpsvc"] Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.297849 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-57jrf"] Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.298883 4965 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.365948 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-57jrf"] Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.488939 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-config\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.489013 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rspx\" (UniqueName: \"kubernetes.io/projected/21e02d79-ae37-4b46-a428-62ba4fa40856-kube-api-access-6rspx\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.489038 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-dns-svc\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.593877 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-config\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.593955 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rspx\" 
(UniqueName: \"kubernetes.io/projected/21e02d79-ae37-4b46-a428-62ba4fa40856-kube-api-access-6rspx\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.593999 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-dns-svc\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.595378 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-dns-svc\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.595606 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-config\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.627904 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w6qsn"] Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.633827 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rspx\" (UniqueName: \"kubernetes.io/projected/21e02d79-ae37-4b46-a428-62ba4fa40856-kube-api-access-6rspx\") pod \"dnsmasq-dns-666b6646f7-57jrf\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.682167 4965 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.683112 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4m6dq"] Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.684750 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.697293 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-config\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.697362 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.697391 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tktqd\" (UniqueName: \"kubernetes.io/projected/ffd25d05-24d3-4719-87c0-64f16eb4ae50-kube-api-access-tktqd\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.716929 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4m6dq"] Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.799006 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tktqd\" (UniqueName: 
\"kubernetes.io/projected/ffd25d05-24d3-4719-87c0-64f16eb4ae50-kube-api-access-tktqd\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.799076 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-config\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.799145 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.799931 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.800196 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-config\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:37 crc kubenswrapper[4965]: I1125 15:23:37.858326 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tktqd\" (UniqueName: \"kubernetes.io/projected/ffd25d05-24d3-4719-87c0-64f16eb4ae50-kube-api-access-tktqd\") pod \"dnsmasq-dns-57d769cc4f-4m6dq\" 
(UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.016354 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.327879 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-57jrf"] Nov 25 15:23:38 crc kubenswrapper[4965]: W1125 15:23:38.343188 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e02d79_ae37_4b46_a428_62ba4fa40856.slice/crio-fd4b497efe30b0a340f435142ea03015e964e0d1a693b28d5ba3129e78fade18 WatchSource:0}: Error finding container fd4b497efe30b0a340f435142ea03015e964e0d1a693b28d5ba3129e78fade18: Status 404 returned error can't find the container with id fd4b497efe30b0a340f435142ea03015e964e0d1a693b28d5ba3129e78fade18 Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.515405 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.517029 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522436 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522572 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522621 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522738 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522751 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522833 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.522932 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gsqfl" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.557487 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.586424 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" event={"ID":"21e02d79-ae37-4b46-a428-62ba4fa40856","Type":"ContainerStarted","Data":"fd4b497efe30b0a340f435142ea03015e964e0d1a693b28d5ba3129e78fade18"} Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.658612 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4m6dq"] Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716278 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716342 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/739d03f5-20b2-4c12-9f3e-fbe795ec890d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716369 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-config-data\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716384 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716401 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/739d03f5-20b2-4c12-9f3e-fbe795ec890d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716418 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716491 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716508 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716546 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716575 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22smk\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-kube-api-access-22smk\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.716598 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817386 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22smk\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-kube-api-access-22smk\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817427 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817470 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817497 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/739d03f5-20b2-4c12-9f3e-fbe795ec890d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817522 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817541 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/739d03f5-20b2-4c12-9f3e-fbe795ec890d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817560 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817583 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817608 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817623 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.817671 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.819021 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-config-data\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.819402 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.819585 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.820202 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.821884 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: 
\"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.824051 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.844827 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/739d03f5-20b2-4c12-9f3e-fbe795ec890d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.844813 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.855304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/739d03f5-20b2-4c12-9f3e-fbe795ec890d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.855765 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.856017 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22smk\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-kube-api-access-22smk\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.900507 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " pod="openstack/rabbitmq-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.904711 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.906039 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.913262 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.913906 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.914150 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.918051 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.918250 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.919196 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.919618 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t5nv5" Nov 25 15:23:38 crc kubenswrapper[4965]: I1125 15:23:38.930706 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.021786 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.021844 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.021887 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.021916 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc 
kubenswrapper[4965]: I1125 15:23:39.021942 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.022029 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.022059 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.022083 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.022121 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: 
I1125 15:23:39.022202 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.022236 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-kube-api-access-2hcd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124017 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124127 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124167 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-kube-api-access-2hcd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124196 
4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124224 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124255 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124279 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124307 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124357 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124387 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.124414 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.126512 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.126741 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.127646 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.130936 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.131774 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.132131 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.136544 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.139011 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: 
I1125 15:23:39.140615 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.142822 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.143785 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.168469 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-kube-api-access-2hcd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.168674 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.250574 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.615040 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" event={"ID":"ffd25d05-24d3-4719-87c0-64f16eb4ae50","Type":"ContainerStarted","Data":"8489133a6dc0756ce299147a8b991c59982b1638ab26b753026d71da4b5a7677"} Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.638240 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:23:39 crc kubenswrapper[4965]: W1125 15:23:39.674384 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d0186d_7eca_48a0_9cc8_56ce4d1caa38.slice/crio-b385bcb1fb02059d7fe7559d4667b942eff49f4849a5c74c43f69d9ecefe4024 WatchSource:0}: Error finding container b385bcb1fb02059d7fe7559d4667b942eff49f4849a5c74c43f69d9ecefe4024: Status 404 returned error can't find the container with id b385bcb1fb02059d7fe7559d4667b942eff49f4849a5c74c43f69d9ecefe4024 Nov 25 15:23:39 crc kubenswrapper[4965]: I1125 15:23:39.772906 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.314815 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.319145 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.321781 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-d4l6d" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.323709 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.329086 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.330439 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.336640 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.340084 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446066 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-config-data-default\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446117 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362085ca-1948-4f56-8add-3e727c63e58e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446201 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362085ca-1948-4f56-8add-3e727c63e58e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446226 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfv77\" (UniqueName: \"kubernetes.io/projected/362085ca-1948-4f56-8add-3e727c63e58e-kube-api-access-wfv77\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446245 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362085ca-1948-4f56-8add-3e727c63e58e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446265 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446290 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-kolla-config\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.446783 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548000 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362085ca-1948-4f56-8add-3e727c63e58e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548097 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362085ca-1948-4f56-8add-3e727c63e58e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548129 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfv77\" (UniqueName: \"kubernetes.io/projected/362085ca-1948-4f56-8add-3e727c63e58e-kube-api-access-wfv77\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548163 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362085ca-1948-4f56-8add-3e727c63e58e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548189 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: 
\"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548227 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-kolla-config\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548267 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.548322 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-config-data-default\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.549945 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-config-data-default\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.551200 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362085ca-1948-4f56-8add-3e727c63e58e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 
15:23:40.551941 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.551932 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-kolla-config\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.553772 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362085ca-1948-4f56-8add-3e727c63e58e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.557835 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362085ca-1948-4f56-8add-3e727c63e58e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.571591 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfv77\" (UniqueName: \"kubernetes.io/projected/362085ca-1948-4f56-8add-3e727c63e58e-kube-api-access-wfv77\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.573237 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/362085ca-1948-4f56-8add-3e727c63e58e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.603187 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"362085ca-1948-4f56-8add-3e727c63e58e\") " pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.651731 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.666271 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"739d03f5-20b2-4c12-9f3e-fbe795ec890d","Type":"ContainerStarted","Data":"3a341c587c33d219d03462d7a93fce9cc17d17e674be7d4e971b45b8ab0aa777"} Nov 25 15:23:40 crc kubenswrapper[4965]: I1125 15:23:40.670763 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"67d0186d-7eca-48a0-9cc8-56ce4d1caa38","Type":"ContainerStarted","Data":"b385bcb1fb02059d7fe7559d4667b942eff49f4849a5c74c43f69d9ecefe4024"} Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.276417 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.709169 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"362085ca-1948-4f56-8add-3e727c63e58e","Type":"ContainerStarted","Data":"4e59ea27a6d7e58d1f8a88c061476c3f108f2b619fba7fa49bb26f0925081f5a"} Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.744779 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.747829 
4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.762938 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2h98p" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.771068 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.771386 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.771596 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.797864 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879759 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879798 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879821 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrv64\" (UniqueName: 
\"kubernetes.io/projected/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-kube-api-access-qrv64\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879852 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879892 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879939 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879955 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.879992 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984103 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984142 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984168 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrv64\" (UniqueName: \"kubernetes.io/projected/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-kube-api-access-qrv64\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984187 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984217 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984267 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984284 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984301 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.984596 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.991722 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.991883 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.992230 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.992885 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:41 crc kubenswrapper[4965]: I1125 15:23:41.999875 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.010657 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " 
pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.016600 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrv64\" (UniqueName: \"kubernetes.io/projected/11c417a7-1f7b-42c4-ba2d-e221bdf95f9f-kube-api-access-qrv64\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.027481 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f\") " pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.112041 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.220116 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.221475 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.239243 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.239440 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.239657 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-lj97z" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.241140 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.302594 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d840d31-e83e-45b7-9863-1e747d7a1290-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.302636 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhv7d\" (UniqueName: \"kubernetes.io/projected/3d840d31-e83e-45b7-9863-1e747d7a1290-kube-api-access-dhv7d\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.302677 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d840d31-e83e-45b7-9863-1e747d7a1290-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.302717 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d840d31-e83e-45b7-9863-1e747d7a1290-kolla-config\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.302734 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d840d31-e83e-45b7-9863-1e747d7a1290-config-data\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.405104 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d840d31-e83e-45b7-9863-1e747d7a1290-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.405159 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhv7d\" (UniqueName: \"kubernetes.io/projected/3d840d31-e83e-45b7-9863-1e747d7a1290-kube-api-access-dhv7d\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.405218 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d840d31-e83e-45b7-9863-1e747d7a1290-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.405260 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d840d31-e83e-45b7-9863-1e747d7a1290-kolla-config\") pod 
\"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.405280 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d840d31-e83e-45b7-9863-1e747d7a1290-config-data\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.406402 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d840d31-e83e-45b7-9863-1e747d7a1290-kolla-config\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.406600 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d840d31-e83e-45b7-9863-1e747d7a1290-config-data\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.408467 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d840d31-e83e-45b7-9863-1e747d7a1290-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.409352 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d840d31-e83e-45b7-9863-1e747d7a1290-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.456496 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhv7d\" 
(UniqueName: \"kubernetes.io/projected/3d840d31-e83e-45b7-9863-1e747d7a1290-kube-api-access-dhv7d\") pod \"memcached-0\" (UID: \"3d840d31-e83e-45b7-9863-1e747d7a1290\") " pod="openstack/memcached-0" Nov 25 15:23:42 crc kubenswrapper[4965]: I1125 15:23:42.562082 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 25 15:23:43 crc kubenswrapper[4965]: I1125 15:23:43.198921 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 15:23:43 crc kubenswrapper[4965]: I1125 15:23:43.459864 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 15:23:43 crc kubenswrapper[4965]: I1125 15:23:43.777044 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3d840d31-e83e-45b7-9863-1e747d7a1290","Type":"ContainerStarted","Data":"884808d39826873c3dbf9310a6067466d3fa5d695ad5c934052f36ffad20cf5e"} Nov 25 15:23:43 crc kubenswrapper[4965]: I1125 15:23:43.778653 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f","Type":"ContainerStarted","Data":"38de753484c8466d2c7a931ac79c7b4331e8021c83feee3bfe6fa49cecb8508d"} Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.372114 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.373043 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.378151 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-89p6g" Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.384039 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.497842 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8qq\" (UniqueName: \"kubernetes.io/projected/8bccea83-ab65-40e5-943f-f35e98b7618c-kube-api-access-4n8qq\") pod \"kube-state-metrics-0\" (UID: \"8bccea83-ab65-40e5-943f-f35e98b7618c\") " pod="openstack/kube-state-metrics-0" Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.614539 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8qq\" (UniqueName: \"kubernetes.io/projected/8bccea83-ab65-40e5-943f-f35e98b7618c-kube-api-access-4n8qq\") pod \"kube-state-metrics-0\" (UID: \"8bccea83-ab65-40e5-943f-f35e98b7618c\") " pod="openstack/kube-state-metrics-0" Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.652192 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n8qq\" (UniqueName: \"kubernetes.io/projected/8bccea83-ab65-40e5-943f-f35e98b7618c-kube-api-access-4n8qq\") pod \"kube-state-metrics-0\" (UID: \"8bccea83-ab65-40e5-943f-f35e98b7618c\") " pod="openstack/kube-state-metrics-0" Nov 25 15:23:44 crc kubenswrapper[4965]: I1125 15:23:44.694772 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.628429 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.631432 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.634897 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.634999 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-t7f6d" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.635184 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.635291 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.635456 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.653510 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.691828 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929774db-0294-4631-b00e-1b664c1d4cba-config\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.691927 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.691956 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/929774db-0294-4631-b00e-1b664c1d4cba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.692070 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg7gv\" (UniqueName: \"kubernetes.io/projected/929774db-0294-4631-b00e-1b664c1d4cba-kube-api-access-fg7gv\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.692122 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.692149 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929774db-0294-4631-b00e-1b664c1d4cba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.693443 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.695094 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796643 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796728 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929774db-0294-4631-b00e-1b664c1d4cba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796753 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796782 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 
15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796854 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929774db-0294-4631-b00e-1b664c1d4cba-config\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796910 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.796945 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/929774db-0294-4631-b00e-1b664c1d4cba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.797032 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg7gv\" (UniqueName: \"kubernetes.io/projected/929774db-0294-4631-b00e-1b664c1d4cba-kube-api-access-fg7gv\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.798687 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.799466 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/929774db-0294-4631-b00e-1b664c1d4cba-config\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.799792 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/929774db-0294-4631-b00e-1b664c1d4cba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.800144 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929774db-0294-4631-b00e-1b664c1d4cba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.803634 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.830839 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.832134 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/929774db-0294-4631-b00e-1b664c1d4cba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 
15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.839468 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg7gv\" (UniqueName: \"kubernetes.io/projected/929774db-0294-4631-b00e-1b664c1d4cba-kube-api-access-fg7gv\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.887649 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"929774db-0294-4631-b00e-1b664c1d4cba\") " pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:47 crc kubenswrapper[4965]: I1125 15:23:47.972672 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.343982 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wscwk"] Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.347414 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.350935 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ws9b7" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.351009 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.351146 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.362915 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wscwk"] Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.408030 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bwsx2"] Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.409746 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417065 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-combined-ca-bundle\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417112 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-scripts\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417141 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-log-ovn\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417175 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-run-ovn\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417214 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-ovn-controller-tls-certs\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " 
pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417244 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-run\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.417270 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsbv\" (UniqueName: \"kubernetes.io/projected/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-kube-api-access-7jsbv\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.422203 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bwsx2"] Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.518937 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-log-ovn\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519280 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-run-ovn\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519407 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63b33f63-d2e7-48cd-92e3-f47404184ba9-scripts\") pod 
\"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519542 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-etc-ovs\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519574 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-log-ovn\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519673 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-ovn-controller-tls-certs\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519698 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-run-ovn\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519752 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljvh\" (UniqueName: \"kubernetes.io/projected/63b33f63-d2e7-48cd-92e3-f47404184ba9-kube-api-access-9ljvh\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " 
pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519786 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-run\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519832 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsbv\" (UniqueName: \"kubernetes.io/projected/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-kube-api-access-7jsbv\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519866 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-log\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.519939 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-run\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.520020 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-var-run\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.520172 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-combined-ca-bundle\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.520260 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-scripts\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.520286 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-lib\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.522505 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-scripts\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.525200 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-ovn-controller-tls-certs\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.527540 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-combined-ca-bundle\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.542990 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsbv\" (UniqueName: \"kubernetes.io/projected/b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed-kube-api-access-7jsbv\") pod \"ovn-controller-wscwk\" (UID: \"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed\") " pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.623674 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljvh\" (UniqueName: \"kubernetes.io/projected/63b33f63-d2e7-48cd-92e3-f47404184ba9-kube-api-access-9ljvh\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.623745 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-log\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.623775 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-run\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.623806 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-lib\") pod \"ovn-controller-ovs-bwsx2\" (UID: 
\"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.623859 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63b33f63-d2e7-48cd-92e3-f47404184ba9-scripts\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.623874 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-etc-ovs\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.624103 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-log\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.624189 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-etc-ovs\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.624209 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-lib\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.624295 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63b33f63-d2e7-48cd-92e3-f47404184ba9-var-run\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.625828 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63b33f63-d2e7-48cd-92e3-f47404184ba9-scripts\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.642876 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljvh\" (UniqueName: \"kubernetes.io/projected/63b33f63-d2e7-48cd-92e3-f47404184ba9-kube-api-access-9ljvh\") pod \"ovn-controller-ovs-bwsx2\" (UID: \"63b33f63-d2e7-48cd-92e3-f47404184ba9\") " pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.667305 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wscwk" Nov 25 15:23:48 crc kubenswrapper[4965]: I1125 15:23:48.745285 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.595674 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.597562 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.602415 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.602843 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.603090 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h54vf" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.603093 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.612460 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700320 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700387 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700444 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700467 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700528 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrx4\" (UniqueName: \"kubernetes.io/projected/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-kube-api-access-wwrx4\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700549 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700563 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.700589 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 
15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803272 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrx4\" (UniqueName: \"kubernetes.io/projected/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-kube-api-access-wwrx4\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803324 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803343 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803371 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803405 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803432 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803506 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.803528 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.804162 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.804527 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.805263 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " 
pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.805624 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.813653 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.813800 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.821189 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.828698 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrx4\" (UniqueName: \"kubernetes.io/projected/8ea3b8e7-7b5d-46e0-b07d-33db65d5305d-kube-api-access-wwrx4\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.851477 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d\") " pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:51 crc kubenswrapper[4965]: I1125 15:23:51.916035 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.260508 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.261873 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.262095 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.262826 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be8cabf8c298dce6dc5c47e109690923bbdb10ab8f0bdbfa1738209ba0e27a1b"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.263013 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" 
podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://be8cabf8c298dce6dc5c47e109690923bbdb10ab8f0bdbfa1738209ba0e27a1b" gracePeriod=600 Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.926888 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="be8cabf8c298dce6dc5c47e109690923bbdb10ab8f0bdbfa1738209ba0e27a1b" exitCode=0 Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.926919 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"be8cabf8c298dce6dc5c47e109690923bbdb10ab8f0bdbfa1738209ba0e27a1b"} Nov 25 15:23:53 crc kubenswrapper[4965]: I1125 15:23:53.926985 4965 scope.go:117] "RemoveContainer" containerID="00ca3c30c6342c0ded628729d3f70a02171e1d4a4c62216224c37d3f6ce21240" Nov 25 15:24:18 crc kubenswrapper[4965]: E1125 15:24:18.772655 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 25 15:24:18 crc kubenswrapper[4965]: E1125 15:24:18.773547 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 
30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22smk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerR
esizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(739d03f5-20b2-4c12-9f3e-fbe795ec890d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:18 crc kubenswrapper[4965]: E1125 15:24:18.777017 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" Nov 25 15:24:19 crc kubenswrapper[4965]: E1125 15:24:19.109354 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" Nov 25 15:24:19 crc kubenswrapper[4965]: E1125 15:24:19.338222 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 25 15:24:19 crc kubenswrapper[4965]: E1125 15:24:19.338391 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 
30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hcd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerR
esizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(67d0186d-7eca-48a0-9cc8-56ce4d1caa38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:19 crc kubenswrapper[4965]: E1125 15:24:19.339492 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" Nov 25 15:24:20 crc kubenswrapper[4965]: E1125 15:24:20.116365 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.170073 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.170517 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wfv77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(362085ca-1948-4f56-8add-3e727c63e58e): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.172131 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="362085ca-1948-4f56-8add-3e727c63e58e" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.180106 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.180327 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/
var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrv64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(11c417a7-1f7b-42c4-ba2d-e221bdf95f9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.181492 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="11c417a7-1f7b-42c4-ba2d-e221bdf95f9f" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.876490 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.876696 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n598h67chcbh669hchffh59fh74h56dh58ch87h5c7h685h566h5b7h8hcdh64fh55fh568hb4h76h558h58hc6h556h5bh8ch5ch8dh56dh5c9q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhv7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(3d840d31-e83e-45b7-9863-1e747d7a1290): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:21 crc kubenswrapper[4965]: E1125 15:24:21.878163 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="3d840d31-e83e-45b7-9863-1e747d7a1290" Nov 25 15:24:22 crc kubenswrapper[4965]: E1125 15:24:22.130548 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="11c417a7-1f7b-42c4-ba2d-e221bdf95f9f" Nov 25 15:24:22 crc kubenswrapper[4965]: E1125 15:24:22.130821 4965 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="3d840d31-e83e-45b7-9863-1e747d7a1290" Nov 25 15:24:22 crc kubenswrapper[4965]: E1125 15:24:22.131081 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="362085ca-1948-4f56-8add-3e727c63e58e" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.068986 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.069326 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wzdcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-w6qsn_openstack(5e810938-3cb6-4551-9568-ddd4a4828ed6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.070632 4965 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" podUID="5e810938-3cb6-4551-9568-ddd4a4828ed6" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.241739 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wscwk"] Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.259214 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.259403 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fpm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vpsvc_openstack(101a9e83-a4ed-43d5-a015-4c52a9fbe424): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.260483 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" podUID="101a9e83-a4ed-43d5-a015-4c52a9fbe424" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.297232 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.297440 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rspx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-57jrf_openstack(21e02d79-ae37-4b46-a428-62ba4fa40856): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.298674 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" podUID="21e02d79-ae37-4b46-a428-62ba4fa40856" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.381510 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.381913 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tktqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-4m6dq_openstack(ffd25d05-24d3-4719-87c0-64f16eb4ae50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:23 crc kubenswrapper[4965]: E1125 15:24:23.383184 4965 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" podUID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.430664 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.468152 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.579167 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-config\") pod \"5e810938-3cb6-4551-9568-ddd4a4828ed6\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.579235 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzdcr\" (UniqueName: \"kubernetes.io/projected/5e810938-3cb6-4551-9568-ddd4a4828ed6-kube-api-access-wzdcr\") pod \"5e810938-3cb6-4551-9568-ddd4a4828ed6\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.579399 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-dns-svc\") pod \"5e810938-3cb6-4551-9568-ddd4a4828ed6\" (UID: \"5e810938-3cb6-4551-9568-ddd4a4828ed6\") " Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.579848 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-config" (OuterVolumeSpecName: "config") pod "5e810938-3cb6-4551-9568-ddd4a4828ed6" (UID: "5e810938-3cb6-4551-9568-ddd4a4828ed6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.580368 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e810938-3cb6-4551-9568-ddd4a4828ed6" (UID: "5e810938-3cb6-4551-9568-ddd4a4828ed6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.585878 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e810938-3cb6-4551-9568-ddd4a4828ed6-kube-api-access-wzdcr" (OuterVolumeSpecName: "kube-api-access-wzdcr") pod "5e810938-3cb6-4551-9568-ddd4a4828ed6" (UID: "5e810938-3cb6-4551-9568-ddd4a4828ed6"). InnerVolumeSpecName "kube-api-access-wzdcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.682094 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.682141 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzdcr\" (UniqueName: \"kubernetes.io/projected/5e810938-3cb6-4551-9568-ddd4a4828ed6-kube-api-access-wzdcr\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:23 crc kubenswrapper[4965]: I1125 15:24:23.682156 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e810938-3cb6-4551-9568-ddd4a4828ed6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.046070 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.152078 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"8bccea83-ab65-40e5-943f-f35e98b7618c","Type":"ContainerStarted","Data":"6a0fbb40a11945386d67f20755348f768744d58ddd7c480a24aa566fa312bc22"} Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.153226 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk" event={"ID":"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed","Type":"ContainerStarted","Data":"a7b7d43dc73180475e5ca79bc28d68f819009c4c8a9d4d9ffd09fddcc0c38aae"} Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.154005 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"929774db-0294-4631-b00e-1b664c1d4cba","Type":"ContainerStarted","Data":"ea086dc2401985a9fb77116e3fe08feccb59cfbdcb66fdf8a7732a53b0dc88be"} Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.166839 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"a0c4deae36fbd6888b83491cb53bd4ad9a4b3cad48a12bfa6331042ee58854cf"} Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.169458 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" event={"ID":"5e810938-3cb6-4551-9568-ddd4a4828ed6","Type":"ContainerDied","Data":"e1d9d8ced7e4eda5fa6824c1466c907cbdcf3a6b0c8a0a945bee9f1199f813bd"} Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.169567 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w6qsn" Nov 25 15:24:24 crc kubenswrapper[4965]: E1125 15:24:24.172499 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" podUID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" Nov 25 15:24:24 crc kubenswrapper[4965]: E1125 15:24:24.172584 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" podUID="21e02d79-ae37-4b46-a428-62ba4fa40856" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.308393 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w6qsn"] Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.319569 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w6qsn"] Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.473226 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.595916 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101a9e83-a4ed-43d5-a015-4c52a9fbe424-config\") pod \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.596127 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fpm6\" (UniqueName: \"kubernetes.io/projected/101a9e83-a4ed-43d5-a015-4c52a9fbe424-kube-api-access-6fpm6\") pod \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\" (UID: \"101a9e83-a4ed-43d5-a015-4c52a9fbe424\") " Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.596482 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/101a9e83-a4ed-43d5-a015-4c52a9fbe424-config" (OuterVolumeSpecName: "config") pod "101a9e83-a4ed-43d5-a015-4c52a9fbe424" (UID: "101a9e83-a4ed-43d5-a015-4c52a9fbe424"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.602568 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/101a9e83-a4ed-43d5-a015-4c52a9fbe424-kube-api-access-6fpm6" (OuterVolumeSpecName: "kube-api-access-6fpm6") pod "101a9e83-a4ed-43d5-a015-4c52a9fbe424" (UID: "101a9e83-a4ed-43d5-a015-4c52a9fbe424"). InnerVolumeSpecName "kube-api-access-6fpm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.698277 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101a9e83-a4ed-43d5-a015-4c52a9fbe424-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.698329 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fpm6\" (UniqueName: \"kubernetes.io/projected/101a9e83-a4ed-43d5-a015-4c52a9fbe424-kube-api-access-6fpm6\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.769703 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 15:24:24 crc kubenswrapper[4965]: I1125 15:24:24.784006 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e810938-3cb6-4551-9568-ddd4a4828ed6" path="/var/lib/kubelet/pods/5e810938-3cb6-4551-9568-ddd4a4828ed6/volumes" Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.023905 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bwsx2"] Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.177820 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d","Type":"ContainerStarted","Data":"f3ee5917d1fb0bed55775b1f027e4fdedc858ecc67edf4a916019a83743daaf4"} Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.179278 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" event={"ID":"101a9e83-a4ed-43d5-a015-4c52a9fbe424","Type":"ContainerDied","Data":"48386f11544cf6564aeabcff03934aa8a9d21fb614b3d1a440e376a4afc548bd"} Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.179129 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vpsvc" Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.180095 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwsx2" event={"ID":"63b33f63-d2e7-48cd-92e3-f47404184ba9","Type":"ContainerStarted","Data":"95aa39e8f7e0d417dc945bb949edb0fa3dc72c81f6f9e64c0e66ead0f4744466"} Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.226638 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vpsvc"] Nov 25 15:24:25 crc kubenswrapper[4965]: I1125 15:24:25.236763 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vpsvc"] Nov 25 15:24:26 crc kubenswrapper[4965]: I1125 15:24:26.781206 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="101a9e83-a4ed-43d5-a015-4c52a9fbe424" path="/var/lib/kubelet/pods/101a9e83-a4ed-43d5-a015-4c52a9fbe424/volumes" Nov 25 15:24:29 crc kubenswrapper[4965]: I1125 15:24:29.210895 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"929774db-0294-4631-b00e-1b664c1d4cba","Type":"ContainerStarted","Data":"935a5a77571a047257ceabe8cda02d36eb4559c6fb65ebb72bdb870f2be1e496"} Nov 25 15:24:29 crc kubenswrapper[4965]: I1125 15:24:29.212887 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d","Type":"ContainerStarted","Data":"568a1fcc6f283e9a02937b6eeb193a49f872402335a746aa32e7a97175930b79"} Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.224167 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk" event={"ID":"b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed","Type":"ContainerStarted","Data":"717aeb080f06bd914a8ec0276ad05ec3469625f4b4bc05c3780cee049a3df5e3"} Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.224467 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-controller-wscwk" Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.227460 4965 generic.go:334] "Generic (PLEG): container finished" podID="63b33f63-d2e7-48cd-92e3-f47404184ba9" containerID="b5ed1a44cb1a73297efc2416693f76cde3466097379b0c661158a33741fd2eaf" exitCode=0 Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.227513 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwsx2" event={"ID":"63b33f63-d2e7-48cd-92e3-f47404184ba9","Type":"ContainerDied","Data":"b5ed1a44cb1a73297efc2416693f76cde3466097379b0c661158a33741fd2eaf"} Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.230001 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8bccea83-ab65-40e5-943f-f35e98b7618c","Type":"ContainerStarted","Data":"ae3899ee6e0566d23cc54c8f00e8b7e62adc98332699fc01301c8265345c1008"} Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.230666 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.246383 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wscwk" podStartSLOduration=37.748556997 podStartE2EDuration="42.246365521s" podCreationTimestamp="2025-11-25 15:23:48 +0000 UTC" firstStartedPulling="2025-11-25 15:24:23.261044353 +0000 UTC m=+1208.228638099" lastFinishedPulling="2025-11-25 15:24:27.758852877 +0000 UTC m=+1212.726446623" observedRunningTime="2025-11-25 15:24:30.24041652 +0000 UTC m=+1215.208010326" watchObservedRunningTime="2025-11-25 15:24:30.246365521 +0000 UTC m=+1215.213959267" Nov 25 15:24:30 crc kubenswrapper[4965]: I1125 15:24:30.272786 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=40.321828105 podStartE2EDuration="46.272768549s" podCreationTimestamp="2025-11-25 15:23:44 +0000 
UTC" firstStartedPulling="2025-11-25 15:24:23.451852044 +0000 UTC m=+1208.419445790" lastFinishedPulling="2025-11-25 15:24:29.402792488 +0000 UTC m=+1214.370386234" observedRunningTime="2025-11-25 15:24:30.261164934 +0000 UTC m=+1215.228758720" watchObservedRunningTime="2025-11-25 15:24:30.272768549 +0000 UTC m=+1215.240362295" Nov 25 15:24:31 crc kubenswrapper[4965]: I1125 15:24:31.239739 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwsx2" event={"ID":"63b33f63-d2e7-48cd-92e3-f47404184ba9","Type":"ContainerStarted","Data":"4d118db623134620e59253c205075df159f7c4b7f2b0e2332e79f8ad07cadaac"} Nov 25 15:24:31 crc kubenswrapper[4965]: I1125 15:24:31.240176 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwsx2" event={"ID":"63b33f63-d2e7-48cd-92e3-f47404184ba9","Type":"ContainerStarted","Data":"5d5fb4fa562e5e0ac62f1b703f6344a414021a8f32b9899b0d21401bad62076d"} Nov 25 15:24:31 crc kubenswrapper[4965]: I1125 15:24:31.267949 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bwsx2" podStartSLOduration=40.408494634 podStartE2EDuration="43.267931518s" podCreationTimestamp="2025-11-25 15:23:48 +0000 UTC" firstStartedPulling="2025-11-25 15:24:25.054655907 +0000 UTC m=+1210.022249653" lastFinishedPulling="2025-11-25 15:24:27.914092791 +0000 UTC m=+1212.881686537" observedRunningTime="2025-11-25 15:24:31.261902213 +0000 UTC m=+1216.229495959" watchObservedRunningTime="2025-11-25 15:24:31.267931518 +0000 UTC m=+1216.235525264" Nov 25 15:24:32 crc kubenswrapper[4965]: I1125 15:24:32.247991 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:24:32 crc kubenswrapper[4965]: I1125 15:24:32.248387 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:24:34 crc kubenswrapper[4965]: I1125 15:24:34.275551 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"739d03f5-20b2-4c12-9f3e-fbe795ec890d","Type":"ContainerStarted","Data":"cf89e00c635745f9ff3cd4216c52d7f9cf91427b5734375cdbbf6a9cd925aaa5"} Nov 25 15:24:34 crc kubenswrapper[4965]: I1125 15:24:34.701215 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 15:24:44 crc kubenswrapper[4965]: E1125 15:24:44.459499 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Nov 25 15:24:44 crc kubenswrapper[4965]: E1125 15:24:44.460312 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:n5dchfbhc9h8fh669h5dfh67ch676h65ch5d9h56fhc6h556hc5h58h56dh684hdfh587hb9h566h66ch96h98hb4h69h7h5cfh685h598h56bh647q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagat
ion:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fg7gv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(929774db-0294-4631-b00e-1b664c1d4cba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:44 crc kubenswrapper[4965]: E1125 15:24:44.461779 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="929774db-0294-4631-b00e-1b664c1d4cba" Nov 25 15:24:47 crc kubenswrapper[4965]: I1125 15:24:47.973203 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 25 15:24:47 crc kubenswrapper[4965]: I1125 15:24:47.973640 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 25 15:24:48 crc kubenswrapper[4965]: I1125 15:24:48.041069 4965 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 25 15:24:51 crc kubenswrapper[4965]: E1125 15:24:51.885828 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="929774db-0294-4631-b00e-1b664c1d4cba" Nov 25 15:24:51 crc kubenswrapper[4965]: E1125 15:24:51.888909 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 25 15:24:51 crc kubenswrapper[4965]: E1125 15:24:51.889136 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wfv77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(362085ca-1948-4f56-8add-3e727c63e58e): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:51 crc kubenswrapper[4965]: E1125 15:24:51.890349 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="362085ca-1948-4f56-8add-3e727c63e58e" Nov 25 15:24:52 crc kubenswrapper[4965]: I1125 15:24:52.474345 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 25 15:24:52 crc kubenswrapper[4965]: I1125 15:24:52.972412 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xk9ml"] Nov 25 15:24:52 crc kubenswrapper[4965]: I1125 15:24:52.973437 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:52 crc kubenswrapper[4965]: I1125 15:24:52.985928 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 25 15:24:52 crc kubenswrapper[4965]: I1125 15:24:52.998106 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xk9ml"] Nov 25 15:24:53 crc kubenswrapper[4965]: E1125 15:24:53.013374 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="929774db-0294-4631-b00e-1b664c1d4cba" Nov 25 15:24:53 crc kubenswrapper[4965]: E1125 15:24:53.069197 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Nov 25 
15:24:53 crc kubenswrapper[4965]: E1125 15:24:53.069604 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:n5c5h64dh59h58ch6bhbdhfdh85h5fbh64h597h588hd6h79h676h57bh75h59fh679h65ch89h57dhf9h5fdh55fhffh5h668h5c9hcfh585h686q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwrx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyste
m:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(8ea3b8e7-7b5d-46e0-b07d-33db65d5305d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:24:53 crc kubenswrapper[4965]: E1125 15:24:53.071899 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="8ea3b8e7-7b5d-46e0-b07d-33db65d5305d" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.110527 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10034cec-97f3-4270-a5ab-e6b589e6ac13-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.110603 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10034cec-97f3-4270-a5ab-e6b589e6ac13-config\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.110665 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/10034cec-97f3-4270-a5ab-e6b589e6ac13-ovn-rundir\") pod \"ovn-controller-metrics-xk9ml\" 
(UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.110727 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/10034cec-97f3-4270-a5ab-e6b589e6ac13-ovs-rundir\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.110891 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10034cec-97f3-4270-a5ab-e6b589e6ac13-combined-ca-bundle\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.110934 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsq4x\" (UniqueName: \"kubernetes.io/projected/10034cec-97f3-4270-a5ab-e6b589e6ac13-kube-api-access-qsq4x\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.201769 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4m6dq"] Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213248 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10034cec-97f3-4270-a5ab-e6b589e6ac13-combined-ca-bundle\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213295 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qsq4x\" (UniqueName: \"kubernetes.io/projected/10034cec-97f3-4270-a5ab-e6b589e6ac13-kube-api-access-qsq4x\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213327 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10034cec-97f3-4270-a5ab-e6b589e6ac13-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213344 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10034cec-97f3-4270-a5ab-e6b589e6ac13-config\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213368 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/10034cec-97f3-4270-a5ab-e6b589e6ac13-ovn-rundir\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213390 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/10034cec-97f3-4270-a5ab-e6b589e6ac13-ovs-rundir\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.213700 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" 
(UniqueName: \"kubernetes.io/host-path/10034cec-97f3-4270-a5ab-e6b589e6ac13-ovs-rundir\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.214218 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/10034cec-97f3-4270-a5ab-e6b589e6ac13-ovn-rundir\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.214653 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10034cec-97f3-4270-a5ab-e6b589e6ac13-config\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.223565 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10034cec-97f3-4270-a5ab-e6b589e6ac13-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.223609 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10034cec-97f3-4270-a5ab-e6b589e6ac13-combined-ca-bundle\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.258511 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f68pz"] Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.259669 4965 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.269705 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsq4x\" (UniqueName: \"kubernetes.io/projected/10034cec-97f3-4270-a5ab-e6b589e6ac13-kube-api-access-qsq4x\") pod \"ovn-controller-metrics-xk9ml\" (UID: \"10034cec-97f3-4270-a5ab-e6b589e6ac13\") " pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.274289 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.299571 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f68pz"] Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.317393 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xk9ml" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.418659 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhjc\" (UniqueName: \"kubernetes.io/projected/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-kube-api-access-xwhjc\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.418748 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.418784 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.418820 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-config\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.451127 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-57jrf"] Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.501114 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bn4gq"] Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.502408 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.507240 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.520787 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bn4gq"] Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.524588 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-config\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.524624 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.524650 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.524669 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7bz7\" (UniqueName: \"kubernetes.io/projected/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-kube-api-access-z7bz7\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.524694 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-config\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.524871 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhjc\" (UniqueName: \"kubernetes.io/projected/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-kube-api-access-xwhjc\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.525115 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.525159 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.525190 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" 
Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.525903 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.525960 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-config\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.526475 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.550470 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhjc\" (UniqueName: \"kubernetes.io/projected/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-kube-api-access-xwhjc\") pod \"dnsmasq-dns-7fd796d7df-f68pz\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: E1125 15:24:53.567312 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="8ea3b8e7-7b5d-46e0-b07d-33db65d5305d" Nov 25 15:24:53 crc kubenswrapper[4965]: E1125 15:24:53.567602 4965 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="929774db-0294-4631-b00e-1b664c1d4cba" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.589518 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.634676 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.635043 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.635079 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7bz7\" (UniqueName: \"kubernetes.io/projected/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-kube-api-access-z7bz7\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.638130 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: 
\"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.638130 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.639458 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-config\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.639880 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.641832 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.648206 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-config\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 
15:24:53.652343 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7bz7\" (UniqueName: \"kubernetes.io/projected/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-kube-api-access-z7bz7\") pod \"dnsmasq-dns-86db49b7ff-bn4gq\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:53 crc kubenswrapper[4965]: I1125 15:24:53.827433 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.224938 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xk9ml"] Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.240805 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f68pz"] Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.441136 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bn4gq"] Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.450215 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" event={"ID":"af526b85-9598-43f5-90d5-bdc6f1a0eb1f","Type":"ContainerStarted","Data":"2da553979127205917b4d62ac9a5b200e68a20e616c820e4c65ce375f6fd4e55"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.461776 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" event={"ID":"ffd25d05-24d3-4719-87c0-64f16eb4ae50","Type":"ContainerStarted","Data":"7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.465420 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" event={"ID":"6623ace9-6fe1-4f1b-bec4-84ec013f20bd","Type":"ContainerStarted","Data":"1824131f9e827b9da93b35eb3f8136f8aea21468d28e25dd34e4c0e6faf8afe5"} Nov 25 15:24:54 crc 
kubenswrapper[4965]: I1125 15:24:54.479710 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"67d0186d-7eca-48a0-9cc8-56ce4d1caa38","Type":"ContainerStarted","Data":"757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.486113 4965 generic.go:334] "Generic (PLEG): container finished" podID="21e02d79-ae37-4b46-a428-62ba4fa40856" containerID="ab7926f15804f7dd2533cf531fdda029468cc90c05da6c0803edcdf65aaf8d5f" exitCode=0 Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.486181 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" event={"ID":"21e02d79-ae37-4b46-a428-62ba4fa40856","Type":"ContainerDied","Data":"ab7926f15804f7dd2533cf531fdda029468cc90c05da6c0803edcdf65aaf8d5f"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.489103 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xk9ml" event={"ID":"10034cec-97f3-4270-a5ab-e6b589e6ac13","Type":"ContainerStarted","Data":"66e5d9183888d42e8e6bfa8fcbf9d124ab7768f03c6f60877a9a3d04bb3f4d0d"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.490323 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3d840d31-e83e-45b7-9863-1e747d7a1290","Type":"ContainerStarted","Data":"509ec0b2f3c86ff78ee3ac5187282b6417ea58fb81fcbbc1ab532e9ef81d1428"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.490587 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.492209 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f","Type":"ContainerStarted","Data":"65645db652073cd1a2b28024fcde820c384d296b5c163f551032494ceab8219a"} Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 
15:24:54.531514 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.500099947 podStartE2EDuration="1m12.531489133s" podCreationTimestamp="2025-11-25 15:23:42 +0000 UTC" firstStartedPulling="2025-11-25 15:23:43.577216806 +0000 UTC m=+1168.544810542" lastFinishedPulling="2025-11-25 15:24:53.608605982 +0000 UTC m=+1238.576199728" observedRunningTime="2025-11-25 15:24:54.519683202 +0000 UTC m=+1239.487276948" watchObservedRunningTime="2025-11-25 15:24:54.531489133 +0000 UTC m=+1239.499082879" Nov 25 15:24:54 crc kubenswrapper[4965]: E1125 15:24:54.534199 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd25d05_24d3_4719_87c0_64f16eb4ae50.slice/crio-7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba.scope\": RecentStats: unable to find data in memory cache]" Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.837032 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.909577 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.916911 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 25 15:24:54 crc kubenswrapper[4965]: I1125 15:24:54.993690 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.002148 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-config\") pod \"21e02d79-ae37-4b46-a428-62ba4fa40856\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.002195 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-dns-svc\") pod \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.002254 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-config\") pod \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.002281 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tktqd\" (UniqueName: \"kubernetes.io/projected/ffd25d05-24d3-4719-87c0-64f16eb4ae50-kube-api-access-tktqd\") pod \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\" (UID: \"ffd25d05-24d3-4719-87c0-64f16eb4ae50\") " Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.002354 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rspx\" (UniqueName: 
\"kubernetes.io/projected/21e02d79-ae37-4b46-a428-62ba4fa40856-kube-api-access-6rspx\") pod \"21e02d79-ae37-4b46-a428-62ba4fa40856\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.002449 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-dns-svc\") pod \"21e02d79-ae37-4b46-a428-62ba4fa40856\" (UID: \"21e02d79-ae37-4b46-a428-62ba4fa40856\") " Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.011143 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e02d79-ae37-4b46-a428-62ba4fa40856-kube-api-access-6rspx" (OuterVolumeSpecName: "kube-api-access-6rspx") pod "21e02d79-ae37-4b46-a428-62ba4fa40856" (UID: "21e02d79-ae37-4b46-a428-62ba4fa40856"). InnerVolumeSpecName "kube-api-access-6rspx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.026181 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd25d05-24d3-4719-87c0-64f16eb4ae50-kube-api-access-tktqd" (OuterVolumeSpecName: "kube-api-access-tktqd") pod "ffd25d05-24d3-4719-87c0-64f16eb4ae50" (UID: "ffd25d05-24d3-4719-87c0-64f16eb4ae50"). InnerVolumeSpecName "kube-api-access-tktqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.036885 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-config" (OuterVolumeSpecName: "config") pod "21e02d79-ae37-4b46-a428-62ba4fa40856" (UID: "21e02d79-ae37-4b46-a428-62ba4fa40856"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.057257 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffd25d05-24d3-4719-87c0-64f16eb4ae50" (UID: "ffd25d05-24d3-4719-87c0-64f16eb4ae50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.065297 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-config" (OuterVolumeSpecName: "config") pod "ffd25d05-24d3-4719-87c0-64f16eb4ae50" (UID: "ffd25d05-24d3-4719-87c0-64f16eb4ae50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.071935 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21e02d79-ae37-4b46-a428-62ba4fa40856" (UID: "21e02d79-ae37-4b46-a428-62ba4fa40856"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.106239 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.106268 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tktqd\" (UniqueName: \"kubernetes.io/projected/ffd25d05-24d3-4719-87c0-64f16eb4ae50-kube-api-access-tktqd\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.106278 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rspx\" (UniqueName: \"kubernetes.io/projected/21e02d79-ae37-4b46-a428-62ba4fa40856-kube-api-access-6rspx\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.106286 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.106294 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21e02d79-ae37-4b46-a428-62ba4fa40856-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.106301 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd25d05-24d3-4719-87c0-64f16eb4ae50-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.506036 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" event={"ID":"21e02d79-ae37-4b46-a428-62ba4fa40856","Type":"ContainerDied","Data":"fd4b497efe30b0a340f435142ea03015e964e0d1a693b28d5ba3129e78fade18"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 
15:24:55.506045 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-57jrf" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.506113 4965 scope.go:117] "RemoveContainer" containerID="ab7926f15804f7dd2533cf531fdda029468cc90c05da6c0803edcdf65aaf8d5f" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.514722 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8ea3b8e7-7b5d-46e0-b07d-33db65d5305d","Type":"ContainerStarted","Data":"a392568c9cba668ea7566c403a87c3b0fd428c07b1895e7613d71a3c5e790ae4"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.516818 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.520485 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xk9ml" event={"ID":"10034cec-97f3-4270-a5ab-e6b589e6ac13","Type":"ContainerStarted","Data":"0f2122099933857a3bce9691c6be5a720a18d1d7d08b8f00f08086773ea2ee79"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.527279 4965 generic.go:334] "Generic (PLEG): container finished" podID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerID="c42488b332d0384f990165aadd0b22d646ddf4a5c4b5bf6584158680c709e42b" exitCode=0 Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.527389 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" event={"ID":"af526b85-9598-43f5-90d5-bdc6f1a0eb1f","Type":"ContainerDied","Data":"c42488b332d0384f990165aadd0b22d646ddf4a5c4b5bf6584158680c709e42b"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.529413 4965 generic.go:334] "Generic (PLEG): container finished" podID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" containerID="7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba" exitCode=0 Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.529466 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" event={"ID":"ffd25d05-24d3-4719-87c0-64f16eb4ae50","Type":"ContainerDied","Data":"7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.529487 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" event={"ID":"ffd25d05-24d3-4719-87c0-64f16eb4ae50","Type":"ContainerDied","Data":"8489133a6dc0756ce299147a8b991c59982b1638ab26b753026d71da4b5a7677"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.529549 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4m6dq" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.536311 4965 generic.go:334] "Generic (PLEG): container finished" podID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerID="16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63" exitCode=0 Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.536492 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" event={"ID":"6623ace9-6fe1-4f1b-bec4-84ec013f20bd","Type":"ContainerDied","Data":"16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63"} Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.552288 4965 scope.go:117] "RemoveContainer" containerID="7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.560823 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=61.553617245 podStartE2EDuration="1m5.56079703s" podCreationTimestamp="2025-11-25 15:23:50 +0000 UTC" firstStartedPulling="2025-11-25 15:24:24.795743811 +0000 UTC m=+1209.763337557" lastFinishedPulling="2025-11-25 15:24:28.802923596 +0000 UTC m=+1213.770517342" observedRunningTime="2025-11-25 15:24:55.55120622 +0000 
UTC m=+1240.518799976" watchObservedRunningTime="2025-11-25 15:24:55.56079703 +0000 UTC m=+1240.528390786" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.576960 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.604896 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xk9ml" podStartSLOduration=3.010332733 podStartE2EDuration="3.60487538s" podCreationTimestamp="2025-11-25 15:24:52 +0000 UTC" firstStartedPulling="2025-11-25 15:24:54.2432411 +0000 UTC m=+1239.210834836" lastFinishedPulling="2025-11-25 15:24:54.837783737 +0000 UTC m=+1239.805377483" observedRunningTime="2025-11-25 15:24:55.602423913 +0000 UTC m=+1240.570017659" watchObservedRunningTime="2025-11-25 15:24:55.60487538 +0000 UTC m=+1240.572469126" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.717815 4965 scope.go:117] "RemoveContainer" containerID="7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba" Nov 25 15:24:55 crc kubenswrapper[4965]: E1125 15:24:55.730605 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba\": container with ID starting with 7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba not found: ID does not exist" containerID="7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.730655 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba"} err="failed to get container status \"7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba\": rpc error: code = NotFound desc = could not find container 
\"7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba\": container with ID starting with 7389ebab660113fa5a67574fb06dd283fca62143bc0d7154621bd0f3055222ba not found: ID does not exist" Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.755777 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4m6dq"] Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.763508 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4m6dq"] Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.813075 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-57jrf"] Nov 25 15:24:55 crc kubenswrapper[4965]: I1125 15:24:55.823190 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-57jrf"] Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.549706 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" event={"ID":"6623ace9-6fe1-4f1b-bec4-84ec013f20bd","Type":"ContainerStarted","Data":"38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6"} Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.549763 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.553121 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" event={"ID":"af526b85-9598-43f5-90d5-bdc6f1a0eb1f","Type":"ContainerStarted","Data":"de9a870f43d29aee7d0caf5f7191254e3b7ea5f9b5cb6402c1f1aeec9baf67ef"} Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.571681 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" podStartSLOduration=3.571664206 podStartE2EDuration="3.571664206s" podCreationTimestamp="2025-11-25 15:24:53 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:24:56.568850649 +0000 UTC m=+1241.536444435" watchObservedRunningTime="2025-11-25 15:24:56.571664206 +0000 UTC m=+1241.539257952" Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.593507 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podStartSLOduration=3.59349054 podStartE2EDuration="3.59349054s" podCreationTimestamp="2025-11-25 15:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:24:56.59274411 +0000 UTC m=+1241.560337856" watchObservedRunningTime="2025-11-25 15:24:56.59349054 +0000 UTC m=+1241.561084286" Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.781305 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e02d79-ae37-4b46-a428-62ba4fa40856" path="/var/lib/kubelet/pods/21e02d79-ae37-4b46-a428-62ba4fa40856/volumes" Nov 25 15:24:56 crc kubenswrapper[4965]: I1125 15:24:56.782000 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" path="/var/lib/kubelet/pods/ffd25d05-24d3-4719-87c0-64f16eb4ae50/volumes" Nov 25 15:24:57 crc kubenswrapper[4965]: I1125 15:24:57.568621 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:24:58 crc kubenswrapper[4965]: I1125 15:24:58.584895 4965 generic.go:334] "Generic (PLEG): container finished" podID="11c417a7-1f7b-42c4-ba2d-e221bdf95f9f" containerID="65645db652073cd1a2b28024fcde820c384d296b5c163f551032494ceab8219a" exitCode=0 Nov 25 15:24:58 crc kubenswrapper[4965]: I1125 15:24:58.585064 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f","Type":"ContainerDied","Data":"65645db652073cd1a2b28024fcde820c384d296b5c163f551032494ceab8219a"} Nov 25 15:24:59 crc kubenswrapper[4965]: I1125 15:24:59.601633 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"11c417a7-1f7b-42c4-ba2d-e221bdf95f9f","Type":"ContainerStarted","Data":"b5282ccd97493d7403a0e4c7bb884ba45ab2008b32c3416567344b7cdb0e25aa"} Nov 25 15:24:59 crc kubenswrapper[4965]: I1125 15:24:59.635112 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.911881669 podStartE2EDuration="1m19.635091891s" podCreationTimestamp="2025-11-25 15:23:40 +0000 UTC" firstStartedPulling="2025-11-25 15:23:43.29588954 +0000 UTC m=+1168.263483286" lastFinishedPulling="2025-11-25 15:24:53.019099762 +0000 UTC m=+1237.986693508" observedRunningTime="2025-11-25 15:24:59.630505456 +0000 UTC m=+1244.598099242" watchObservedRunningTime="2025-11-25 15:24:59.635091891 +0000 UTC m=+1244.602685637" Nov 25 15:25:02 crc kubenswrapper[4965]: I1125 15:25:02.113092 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 25 15:25:02 crc kubenswrapper[4965]: I1125 15:25:02.113536 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 25 15:25:02 crc kubenswrapper[4965]: I1125 15:25:02.564178 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.592154 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.711744 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wscwk" podUID="b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed" 
containerName="ovn-controller" probeResult="failure" output=< Nov 25 15:25:03 crc kubenswrapper[4965]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 25 15:25:03 crc kubenswrapper[4965]: > Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.780529 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.823462 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bwsx2" Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.831359 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.914155 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f68pz"] Nov 25 15:25:03 crc kubenswrapper[4965]: I1125 15:25:03.914680 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerName="dnsmasq-dns" containerID="cri-o://38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6" gracePeriod=10 Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.107067 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wscwk-config-76sqh"] Nov 25 15:25:04 crc kubenswrapper[4965]: E1125 15:25:04.107617 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e02d79-ae37-4b46-a428-62ba4fa40856" containerName="init" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.107633 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e02d79-ae37-4b46-a428-62ba4fa40856" containerName="init" Nov 25 15:25:04 crc kubenswrapper[4965]: E1125 15:25:04.107649 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" containerName="init" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.107655 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" containerName="init" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.107795 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd25d05-24d3-4719-87c0-64f16eb4ae50" containerName="init" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.107806 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e02d79-ae37-4b46-a428-62ba4fa40856" containerName="init" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.108311 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.110199 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.124488 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wscwk-config-76sqh"] Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.192681 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpjx2\" (UniqueName: \"kubernetes.io/projected/b70d9678-c0fc-4b3e-9b45-23e66527c888-kube-api-access-mpjx2\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.192730 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-additional-scripts\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " 
pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.192764 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-scripts\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.192792 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run-ovn\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.192816 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.192872 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-log-ovn\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296169 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpjx2\" (UniqueName: \"kubernetes.io/projected/b70d9678-c0fc-4b3e-9b45-23e66527c888-kube-api-access-mpjx2\") pod \"ovn-controller-wscwk-config-76sqh\" 
(UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296250 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-additional-scripts\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296314 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-scripts\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296372 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run-ovn\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296401 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296480 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-log-ovn\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: 
\"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.296857 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-log-ovn\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.298272 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-additional-scripts\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.298356 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run-ovn\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.299194 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.301315 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-scripts\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " 
pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.319652 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpjx2\" (UniqueName: \"kubernetes.io/projected/b70d9678-c0fc-4b3e-9b45-23e66527c888-kube-api-access-mpjx2\") pod \"ovn-controller-wscwk-config-76sqh\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.376003 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.423193 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.506743 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwhjc\" (UniqueName: \"kubernetes.io/projected/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-kube-api-access-xwhjc\") pod \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.509296 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-ovsdbserver-nb\") pod \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.509387 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-dns-svc\") pod \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.509455 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-config\") pod \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\" (UID: \"6623ace9-6fe1-4f1b-bec4-84ec013f20bd\") " Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.511759 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-kube-api-access-xwhjc" (OuterVolumeSpecName: "kube-api-access-xwhjc") pod "6623ace9-6fe1-4f1b-bec4-84ec013f20bd" (UID: "6623ace9-6fe1-4f1b-bec4-84ec013f20bd"). InnerVolumeSpecName "kube-api-access-xwhjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.554522 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6623ace9-6fe1-4f1b-bec4-84ec013f20bd" (UID: "6623ace9-6fe1-4f1b-bec4-84ec013f20bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.559368 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6623ace9-6fe1-4f1b-bec4-84ec013f20bd" (UID: "6623ace9-6fe1-4f1b-bec4-84ec013f20bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.582353 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-config" (OuterVolumeSpecName: "config") pod "6623ace9-6fe1-4f1b-bec4-84ec013f20bd" (UID: "6623ace9-6fe1-4f1b-bec4-84ec013f20bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.611204 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwhjc\" (UniqueName: \"kubernetes.io/projected/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-kube-api-access-xwhjc\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.611247 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.611260 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.611271 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6623ace9-6fe1-4f1b-bec4-84ec013f20bd-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.659017 4965 generic.go:334] "Generic (PLEG): container finished" podID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerID="38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6" exitCode=0 Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.659093 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" event={"ID":"6623ace9-6fe1-4f1b-bec4-84ec013f20bd","Type":"ContainerDied","Data":"38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6"} Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.659123 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" event={"ID":"6623ace9-6fe1-4f1b-bec4-84ec013f20bd","Type":"ContainerDied","Data":"1824131f9e827b9da93b35eb3f8136f8aea21468d28e25dd34e4c0e6faf8afe5"} Nov 25 
15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.659149 4965 scope.go:117] "RemoveContainer" containerID="38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.659301 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-f68pz" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.681554 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"362085ca-1948-4f56-8add-3e727c63e58e","Type":"ContainerStarted","Data":"9b716bb0abc41b92d3353630c817fc82e0ef06e894d386a0d5bb0e6f85792392"} Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.698099 4965 scope.go:117] "RemoveContainer" containerID="16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.725556 4965 scope.go:117] "RemoveContainer" containerID="38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6" Nov 25 15:25:04 crc kubenswrapper[4965]: E1125 15:25:04.725906 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6\": container with ID starting with 38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6 not found: ID does not exist" containerID="38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.725950 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6"} err="failed to get container status \"38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6\": rpc error: code = NotFound desc = could not find container \"38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6\": container with ID starting with 
38f08ca48007f6988c1649ec6c500cc6c39e84595b6d1a8f15304318b0248bb6 not found: ID does not exist" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.725991 4965 scope.go:117] "RemoveContainer" containerID="16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63" Nov 25 15:25:04 crc kubenswrapper[4965]: E1125 15:25:04.726312 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63\": container with ID starting with 16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63 not found: ID does not exist" containerID="16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.726336 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63"} err="failed to get container status \"16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63\": rpc error: code = NotFound desc = could not find container \"16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63\": container with ID starting with 16b9ab0b5114769d34ec7a0a2785aebbbafb8e1be1ee5f6587c61cda4ece1d63 not found: ID does not exist" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.727738 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f68pz"] Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.733355 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-f68pz"] Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.784715 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" path="/var/lib/kubelet/pods/6623ace9-6fe1-4f1b-bec4-84ec013f20bd/volumes" Nov 25 15:25:04 crc kubenswrapper[4965]: E1125 15:25:04.800455 4965 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6623ace9_6fe1_4f1b_bec4_84ec013f20bd.slice/crio-1824131f9e827b9da93b35eb3f8136f8aea21468d28e25dd34e4c0e6faf8afe5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6623ace9_6fe1_4f1b_bec4_84ec013f20bd.slice\": RecentStats: unable to find data in memory cache]" Nov 25 15:25:04 crc kubenswrapper[4965]: I1125 15:25:04.876094 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wscwk-config-76sqh"] Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.690275 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"929774db-0294-4631-b00e-1b664c1d4cba","Type":"ContainerStarted","Data":"045de8f68a3e5cd28b19bf65297e45d8699ef00cdca34e69a28ae6fb6bd2afca"} Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.694063 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-76sqh" event={"ID":"b70d9678-c0fc-4b3e-9b45-23e66527c888","Type":"ContainerStarted","Data":"60ff6ebdcd5df07ce60550aa76f7e3b780a9bf131c02982fc5c61be769315d3b"} Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.694107 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-76sqh" event={"ID":"b70d9678-c0fc-4b3e-9b45-23e66527c888","Type":"ContainerStarted","Data":"85f9ecf993b0420735c0b348308de6427cc54207633d3e222b3213ea1bfd15bb"} Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.762107 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=76.061170455 podStartE2EDuration="1m19.762085355s" podCreationTimestamp="2025-11-25 15:23:46 +0000 UTC" firstStartedPulling="2025-11-25 15:24:24.057917976 +0000 UTC m=+1209.025511722" 
lastFinishedPulling="2025-11-25 15:24:27.758832866 +0000 UTC m=+1212.726426622" observedRunningTime="2025-11-25 15:25:05.726955059 +0000 UTC m=+1250.694548805" watchObservedRunningTime="2025-11-25 15:25:05.762085355 +0000 UTC m=+1250.729679101" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.764463 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wscwk-config-76sqh" podStartSLOduration=1.764449019 podStartE2EDuration="1.764449019s" podCreationTimestamp="2025-11-25 15:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:05.7556449 +0000 UTC m=+1250.723238646" watchObservedRunningTime="2025-11-25 15:25:05.764449019 +0000 UTC m=+1250.732042765" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.842183 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 25 15:25:05 crc kubenswrapper[4965]: E1125 15:25:05.842559 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerName="init" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.842576 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerName="init" Nov 25 15:25:05 crc kubenswrapper[4965]: E1125 15:25:05.842589 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerName="dnsmasq-dns" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.842596 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerName="dnsmasq-dns" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.842763 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6623ace9-6fe1-4f1b-bec4-84ec013f20bd" containerName="dnsmasq-dns" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.843646 4965 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.848710 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.848778 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.848823 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.850165 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gg2dn" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.857941 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935434 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935599 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6d8p\" (UniqueName: \"kubernetes.io/projected/4e3054ed-5cc4-4dce-9b59-72ff19700b27-kube-api-access-j6d8p\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935628 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e3054ed-5cc4-4dce-9b59-72ff19700b27-scripts\") pod \"ovn-northd-0\" (UID: 
\"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935656 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935690 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e3054ed-5cc4-4dce-9b59-72ff19700b27-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935759 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3054ed-5cc4-4dce-9b59-72ff19700b27-config\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:05 crc kubenswrapper[4965]: I1125 15:25:05.935894 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.037802 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6d8p\" (UniqueName: \"kubernetes.io/projected/4e3054ed-5cc4-4dce-9b59-72ff19700b27-kube-api-access-j6d8p\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 
15:25:06.038251 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e3054ed-5cc4-4dce-9b59-72ff19700b27-scripts\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.038290 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.038317 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e3054ed-5cc4-4dce-9b59-72ff19700b27-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.038350 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3054ed-5cc4-4dce-9b59-72ff19700b27-config\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.038376 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.038406 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.038874 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e3054ed-5cc4-4dce-9b59-72ff19700b27-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.039292 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e3054ed-5cc4-4dce-9b59-72ff19700b27-scripts\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.040021 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3054ed-5cc4-4dce-9b59-72ff19700b27-config\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.045527 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.056770 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.057557 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4e3054ed-5cc4-4dce-9b59-72ff19700b27-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.060800 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6d8p\" (UniqueName: \"kubernetes.io/projected/4e3054ed-5cc4-4dce-9b59-72ff19700b27-kube-api-access-j6d8p\") pod \"ovn-northd-0\" (UID: \"4e3054ed-5cc4-4dce-9b59-72ff19700b27\") " pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.182897 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.240057 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.353947 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.528672 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.708839 4965 generic.go:334] "Generic (PLEG): container finished" podID="b70d9678-c0fc-4b3e-9b45-23e66527c888" containerID="60ff6ebdcd5df07ce60550aa76f7e3b780a9bf131c02982fc5c61be769315d3b" exitCode=0 Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.708907 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-76sqh" event={"ID":"b70d9678-c0fc-4b3e-9b45-23e66527c888","Type":"ContainerDied","Data":"60ff6ebdcd5df07ce60550aa76f7e3b780a9bf131c02982fc5c61be769315d3b"} Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.711651 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"4e3054ed-5cc4-4dce-9b59-72ff19700b27","Type":"ContainerStarted","Data":"4cdfc0348f083293290a04af525aae4fba0338bdbdc2304c5823293d74d69b41"} Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.714724 4965 generic.go:334] "Generic (PLEG): container finished" podID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerID="cf89e00c635745f9ff3cd4216c52d7f9cf91427b5734375cdbbf6a9cd925aaa5" exitCode=0 Nov 25 15:25:06 crc kubenswrapper[4965]: I1125 15:25:06.715501 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"739d03f5-20b2-4c12-9f3e-fbe795ec890d","Type":"ContainerDied","Data":"cf89e00c635745f9ff3cd4216c52d7f9cf91427b5734375cdbbf6a9cd925aaa5"} Nov 25 15:25:07 crc kubenswrapper[4965]: I1125 15:25:07.727190 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"739d03f5-20b2-4c12-9f3e-fbe795ec890d","Type":"ContainerStarted","Data":"485f97d15b5d763accc387f4ca2c06c6a93725f90d652e5d986b2b9ef4af1f74"} Nov 25 15:25:07 crc kubenswrapper[4965]: I1125 15:25:07.728226 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.120529 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.143463 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.47880227 podStartE2EDuration="1m31.14342569s" podCreationTimestamp="2025-11-25 15:23:37 +0000 UTC" firstStartedPulling="2025-11-25 15:23:39.773022925 +0000 UTC m=+1164.740616671" lastFinishedPulling="2025-11-25 15:24:32.437646345 +0000 UTC m=+1217.405240091" observedRunningTime="2025-11-25 15:25:07.756525213 +0000 UTC m=+1252.724118959" watchObservedRunningTime="2025-11-25 15:25:08.14342569 +0000 UTC m=+1253.111019436" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.295666 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpjx2\" (UniqueName: \"kubernetes.io/projected/b70d9678-c0fc-4b3e-9b45-23e66527c888-kube-api-access-mpjx2\") pod \"b70d9678-c0fc-4b3e-9b45-23e66527c888\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.295737 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run\") pod \"b70d9678-c0fc-4b3e-9b45-23e66527c888\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.295830 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-additional-scripts\") pod \"b70d9678-c0fc-4b3e-9b45-23e66527c888\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.295856 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run-ovn\") pod \"b70d9678-c0fc-4b3e-9b45-23e66527c888\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.295890 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-log-ovn\") pod \"b70d9678-c0fc-4b3e-9b45-23e66527c888\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.296039 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-scripts\") pod \"b70d9678-c0fc-4b3e-9b45-23e66527c888\" (UID: \"b70d9678-c0fc-4b3e-9b45-23e66527c888\") " Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.298110 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run" (OuterVolumeSpecName: "var-run") pod "b70d9678-c0fc-4b3e-9b45-23e66527c888" (UID: "b70d9678-c0fc-4b3e-9b45-23e66527c888"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.298186 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b70d9678-c0fc-4b3e-9b45-23e66527c888" (UID: "b70d9678-c0fc-4b3e-9b45-23e66527c888"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.298186 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-scripts" (OuterVolumeSpecName: "scripts") pod "b70d9678-c0fc-4b3e-9b45-23e66527c888" (UID: "b70d9678-c0fc-4b3e-9b45-23e66527c888"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.298217 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b70d9678-c0fc-4b3e-9b45-23e66527c888" (UID: "b70d9678-c0fc-4b3e-9b45-23e66527c888"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.298385 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b70d9678-c0fc-4b3e-9b45-23e66527c888" (UID: "b70d9678-c0fc-4b3e-9b45-23e66527c888"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.300692 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70d9678-c0fc-4b3e-9b45-23e66527c888-kube-api-access-mpjx2" (OuterVolumeSpecName: "kube-api-access-mpjx2") pod "b70d9678-c0fc-4b3e-9b45-23e66527c888" (UID: "b70d9678-c0fc-4b3e-9b45-23e66527c888"). InnerVolumeSpecName "kube-api-access-mpjx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.398670 4965 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.398713 4965 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.398728 4965 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.398740 4965 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70d9678-c0fc-4b3e-9b45-23e66527c888-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.398749 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70d9678-c0fc-4b3e-9b45-23e66527c888-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.398759 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpjx2\" (UniqueName: \"kubernetes.io/projected/b70d9678-c0fc-4b3e-9b45-23e66527c888-kube-api-access-mpjx2\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.723199 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wscwk" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.734210 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-76sqh" 
event={"ID":"b70d9678-c0fc-4b3e-9b45-23e66527c888","Type":"ContainerDied","Data":"85f9ecf993b0420735c0b348308de6427cc54207633d3e222b3213ea1bfd15bb"} Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.734244 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f9ecf993b0420735c0b348308de6427cc54207633d3e222b3213ea1bfd15bb" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.734259 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wscwk-config-76sqh" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.738235 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4e3054ed-5cc4-4dce-9b59-72ff19700b27","Type":"ContainerStarted","Data":"98fc32fefb3fa3f9cb3d2c690b5c4cba6514e6f04f6c94e33e0b1756121bf53b"} Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.738279 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4e3054ed-5cc4-4dce-9b59-72ff19700b27","Type":"ContainerStarted","Data":"83f8148f9e57d4dc12bb3c9f86ebd6664b4477d4797efcbdcb91ec1feef33e8f"} Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.738356 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.739820 4965 generic.go:334] "Generic (PLEG): container finished" podID="362085ca-1948-4f56-8add-3e727c63e58e" containerID="9b716bb0abc41b92d3353630c817fc82e0ef06e894d386a0d5bb0e6f85792392" exitCode=0 Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.739897 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"362085ca-1948-4f56-8add-3e727c63e58e","Type":"ContainerDied","Data":"9b716bb0abc41b92d3353630c817fc82e0ef06e894d386a0d5bb0e6f85792392"} Nov 25 15:25:08 crc kubenswrapper[4965]: I1125 15:25:08.813825 4965 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.597799414 podStartE2EDuration="3.813800681s" podCreationTimestamp="2025-11-25 15:25:05 +0000 UTC" firstStartedPulling="2025-11-25 15:25:06.579885207 +0000 UTC m=+1251.547478953" lastFinishedPulling="2025-11-25 15:25:07.795886474 +0000 UTC m=+1252.763480220" observedRunningTime="2025-11-25 15:25:08.809310899 +0000 UTC m=+1253.776904685" watchObservedRunningTime="2025-11-25 15:25:08.813800681 +0000 UTC m=+1253.781394427" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.258538 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wscwk-config-76sqh"] Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.267370 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wscwk-config-76sqh"] Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.342902 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wscwk-config-xbvg9"] Nov 25 15:25:09 crc kubenswrapper[4965]: E1125 15:25:09.343475 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70d9678-c0fc-4b3e-9b45-23e66527c888" containerName="ovn-config" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.343492 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70d9678-c0fc-4b3e-9b45-23e66527c888" containerName="ovn-config" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.343673 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70d9678-c0fc-4b3e-9b45-23e66527c888" containerName="ovn-config" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.344246 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.353629 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.366598 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wscwk-config-xbvg9"] Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.520917 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-log-ovn\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.521041 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-additional-scripts\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.521127 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4rl2\" (UniqueName: \"kubernetes.io/projected/c83dabb6-bee8-4865-8238-b66d6601f3e2-kube-api-access-k4rl2\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.521265 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: 
\"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.521334 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run-ovn\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.521398 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-scripts\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.622649 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.622949 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run-ovn\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623137 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-scripts\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: 
\"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623177 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623227 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run-ovn\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623243 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-log-ovn\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623289 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-log-ovn\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623331 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-additional-scripts\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " 
pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.623397 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4rl2\" (UniqueName: \"kubernetes.io/projected/c83dabb6-bee8-4865-8238-b66d6601f3e2-kube-api-access-k4rl2\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.624125 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-additional-scripts\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.625304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-scripts\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.653010 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4rl2\" (UniqueName: \"kubernetes.io/projected/c83dabb6-bee8-4865-8238-b66d6601f3e2-kube-api-access-k4rl2\") pod \"ovn-controller-wscwk-config-xbvg9\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.665385 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.750908 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"362085ca-1948-4f56-8add-3e727c63e58e","Type":"ContainerStarted","Data":"46cc92d463332cdcffdde6cb454fb8a2524e2a99c4b33dbd8cb19a7f101fb8a7"} Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.799472 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371946.055328 podStartE2EDuration="1m30.7994474s" podCreationTimestamp="2025-11-25 15:23:39 +0000 UTC" firstStartedPulling="2025-11-25 15:23:41.280213435 +0000 UTC m=+1166.247807191" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:09.780238517 +0000 UTC m=+1254.747832263" watchObservedRunningTime="2025-11-25 15:25:09.7994474 +0000 UTC m=+1254.767041146" Nov 25 15:25:09 crc kubenswrapper[4965]: I1125 15:25:09.982080 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wscwk-config-xbvg9"] Nov 25 15:25:10 crc kubenswrapper[4965]: I1125 15:25:10.652866 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 25 15:25:10 crc kubenswrapper[4965]: I1125 15:25:10.652920 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 25 15:25:10 crc kubenswrapper[4965]: I1125 15:25:10.766534 4965 generic.go:334] "Generic (PLEG): container finished" podID="c83dabb6-bee8-4865-8238-b66d6601f3e2" containerID="21e62dea2ce289ec599ff990be43132fe7f207dc9f4fbdc15c2a9cc8680622ba" exitCode=0 Nov 25 15:25:10 crc kubenswrapper[4965]: I1125 15:25:10.766665 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-xbvg9" 
event={"ID":"c83dabb6-bee8-4865-8238-b66d6601f3e2","Type":"ContainerDied","Data":"21e62dea2ce289ec599ff990be43132fe7f207dc9f4fbdc15c2a9cc8680622ba"} Nov 25 15:25:10 crc kubenswrapper[4965]: I1125 15:25:10.766743 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-xbvg9" event={"ID":"c83dabb6-bee8-4865-8238-b66d6601f3e2","Type":"ContainerStarted","Data":"12d1736b980e2aa088f10f75b9a32873a3aca84d3166c035ed11473ec3b3080f"} Nov 25 15:25:10 crc kubenswrapper[4965]: I1125 15:25:10.782822 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b70d9678-c0fc-4b3e-9b45-23e66527c888" path="/var/lib/kubelet/pods/b70d9678-c0fc-4b3e-9b45-23e66527c888/volumes" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.096571 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170229 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-scripts\") pod \"c83dabb6-bee8-4865-8238-b66d6601f3e2\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170371 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-additional-scripts\") pod \"c83dabb6-bee8-4865-8238-b66d6601f3e2\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170474 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run-ovn\") pod \"c83dabb6-bee8-4865-8238-b66d6601f3e2\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " Nov 25 15:25:12 crc 
kubenswrapper[4965]: I1125 15:25:12.170555 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run\") pod \"c83dabb6-bee8-4865-8238-b66d6601f3e2\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170582 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4rl2\" (UniqueName: \"kubernetes.io/projected/c83dabb6-bee8-4865-8238-b66d6601f3e2-kube-api-access-k4rl2\") pod \"c83dabb6-bee8-4865-8238-b66d6601f3e2\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170631 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-log-ovn\") pod \"c83dabb6-bee8-4865-8238-b66d6601f3e2\" (UID: \"c83dabb6-bee8-4865-8238-b66d6601f3e2\") " Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170665 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run" (OuterVolumeSpecName: "var-run") pod "c83dabb6-bee8-4865-8238-b66d6601f3e2" (UID: "c83dabb6-bee8-4865-8238-b66d6601f3e2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170759 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c83dabb6-bee8-4865-8238-b66d6601f3e2" (UID: "c83dabb6-bee8-4865-8238-b66d6601f3e2"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.170692 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c83dabb6-bee8-4865-8238-b66d6601f3e2" (UID: "c83dabb6-bee8-4865-8238-b66d6601f3e2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.171423 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c83dabb6-bee8-4865-8238-b66d6601f3e2" (UID: "c83dabb6-bee8-4865-8238-b66d6601f3e2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.171478 4965 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.171506 4965 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.171518 4965 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c83dabb6-bee8-4865-8238-b66d6601f3e2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.171683 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-scripts" (OuterVolumeSpecName: "scripts") pod 
"c83dabb6-bee8-4865-8238-b66d6601f3e2" (UID: "c83dabb6-bee8-4865-8238-b66d6601f3e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.190174 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83dabb6-bee8-4865-8238-b66d6601f3e2-kube-api-access-k4rl2" (OuterVolumeSpecName: "kube-api-access-k4rl2") pod "c83dabb6-bee8-4865-8238-b66d6601f3e2" (UID: "c83dabb6-bee8-4865-8238-b66d6601f3e2"). InnerVolumeSpecName "kube-api-access-k4rl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.272766 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4rl2\" (UniqueName: \"kubernetes.io/projected/c83dabb6-bee8-4865-8238-b66d6601f3e2-kube-api-access-k4rl2\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.272800 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.272809 4965 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c83dabb6-bee8-4865-8238-b66d6601f3e2-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.784171 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wscwk-config-xbvg9" event={"ID":"c83dabb6-bee8-4865-8238-b66d6601f3e2","Type":"ContainerDied","Data":"12d1736b980e2aa088f10f75b9a32873a3aca84d3166c035ed11473ec3b3080f"} Nov 25 15:25:12 crc kubenswrapper[4965]: I1125 15:25:12.784226 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d1736b980e2aa088f10f75b9a32873a3aca84d3166c035ed11473ec3b3080f" Nov 25 15:25:12 
crc kubenswrapper[4965]: I1125 15:25:12.784194 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wscwk-config-xbvg9" Nov 25 15:25:13 crc kubenswrapper[4965]: I1125 15:25:13.188093 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wscwk-config-xbvg9"] Nov 25 15:25:13 crc kubenswrapper[4965]: I1125 15:25:13.194931 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wscwk-config-xbvg9"] Nov 25 15:25:14 crc kubenswrapper[4965]: I1125 15:25:14.738672 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 25 15:25:14 crc kubenswrapper[4965]: I1125 15:25:14.795035 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83dabb6-bee8-4865-8238-b66d6601f3e2" path="/var/lib/kubelet/pods/c83dabb6-bee8-4865-8238-b66d6601f3e2/volumes" Nov 25 15:25:14 crc kubenswrapper[4965]: I1125 15:25:14.845255 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.146200 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.511436 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ae27-account-create-2ng9k"] Nov 25 15:25:19 crc kubenswrapper[4965]: E1125 15:25:19.511781 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83dabb6-bee8-4865-8238-b66d6601f3e2" containerName="ovn-config" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.511798 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83dabb6-bee8-4865-8238-b66d6601f3e2" containerName="ovn-config" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.511959 4965 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c83dabb6-bee8-4865-8238-b66d6601f3e2" containerName="ovn-config" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.512467 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.518538 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.532710 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ae27-account-create-2ng9k"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.598829 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2155fa2-3b94-4413-84de-171185b0d253-operator-scripts\") pod \"barbican-ae27-account-create-2ng9k\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.598876 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbnwn\" (UniqueName: \"kubernetes.io/projected/b2155fa2-3b94-4413-84de-171185b0d253-kube-api-access-zbnwn\") pod \"barbican-ae27-account-create-2ng9k\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.618996 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-krr6h"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.619929 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.700892 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbnwn\" (UniqueName: \"kubernetes.io/projected/b2155fa2-3b94-4413-84de-171185b0d253-kube-api-access-zbnwn\") pod \"barbican-ae27-account-create-2ng9k\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.701021 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/656770af-f30e-490a-9987-71cc29c5e278-operator-scripts\") pod \"barbican-db-create-krr6h\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.701042 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9vjn\" (UniqueName: \"kubernetes.io/projected/656770af-f30e-490a-9987-71cc29c5e278-kube-api-access-n9vjn\") pod \"barbican-db-create-krr6h\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.701146 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2155fa2-3b94-4413-84de-171185b0d253-operator-scripts\") pod \"barbican-ae27-account-create-2ng9k\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.706814 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-krr6h"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.713640 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-create-k2wml"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.714703 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.753401 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a38e-account-create-jh8kh"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.754362 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.758772 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.771020 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k2wml"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.776567 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a38e-account-create-jh8kh"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.803185 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-operator-scripts\") pod \"cinder-a38e-account-create-jh8kh\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.803230 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c5ln\" (UniqueName: \"kubernetes.io/projected/6c898509-daeb-4566-b5b1-39f4b742b5d0-kube-api-access-5c5ln\") pod \"cinder-db-create-k2wml\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.803252 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrkw\" (UniqueName: \"kubernetes.io/projected/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-kube-api-access-msrkw\") pod \"cinder-a38e-account-create-jh8kh\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.803302 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c898509-daeb-4566-b5b1-39f4b742b5d0-operator-scripts\") pod \"cinder-db-create-k2wml\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.803489 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/656770af-f30e-490a-9987-71cc29c5e278-operator-scripts\") pod \"barbican-db-create-krr6h\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.803561 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9vjn\" (UniqueName: \"kubernetes.io/projected/656770af-f30e-490a-9987-71cc29c5e278-kube-api-access-n9vjn\") pod \"barbican-db-create-krr6h\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.905725 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c898509-daeb-4566-b5b1-39f4b742b5d0-operator-scripts\") pod \"cinder-db-create-k2wml\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.905890 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-operator-scripts\") pod \"cinder-a38e-account-create-jh8kh\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.905920 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c5ln\" (UniqueName: \"kubernetes.io/projected/6c898509-daeb-4566-b5b1-39f4b742b5d0-kube-api-access-5c5ln\") pod \"cinder-db-create-k2wml\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.905936 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrkw\" (UniqueName: \"kubernetes.io/projected/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-kube-api-access-msrkw\") pod \"cinder-a38e-account-create-jh8kh\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.953794 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-f6bkw"] Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.955155 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:19 crc kubenswrapper[4965]: I1125 15:25:19.973951 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-f6bkw"] Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.006818 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tth5\" (UniqueName: \"kubernetes.io/projected/61094cee-413d-4b91-ace8-69afdbaa6226-kube-api-access-4tth5\") pod \"neutron-db-create-f6bkw\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.007069 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61094cee-413d-4b91-ace8-69afdbaa6226-operator-scripts\") pod \"neutron-db-create-f6bkw\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.027852 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-17b8-account-create-f8lf2"] Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.029016 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.039830 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-17b8-account-create-f8lf2"] Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.042708 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.108843 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff1c96-a8a8-431e-ab9b-a76d3992463f-operator-scripts\") pod \"neutron-17b8-account-create-f8lf2\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.108928 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gg7\" (UniqueName: \"kubernetes.io/projected/59ff1c96-a8a8-431e-ab9b-a76d3992463f-kube-api-access-w5gg7\") pod \"neutron-17b8-account-create-f8lf2\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.109031 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61094cee-413d-4b91-ace8-69afdbaa6226-operator-scripts\") pod \"neutron-db-create-f6bkw\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.109096 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tth5\" (UniqueName: \"kubernetes.io/projected/61094cee-413d-4b91-ace8-69afdbaa6226-kube-api-access-4tth5\") pod \"neutron-db-create-f6bkw\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " 
pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.168759 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2155fa2-3b94-4413-84de-171185b0d253-operator-scripts\") pod \"barbican-ae27-account-create-2ng9k\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.169040 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c898509-daeb-4566-b5b1-39f4b742b5d0-operator-scripts\") pod \"cinder-db-create-k2wml\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.169164 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/656770af-f30e-490a-9987-71cc29c5e278-operator-scripts\") pod \"barbican-db-create-krr6h\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.169183 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-operator-scripts\") pod \"cinder-a38e-account-create-jh8kh\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.171988 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61094cee-413d-4b91-ace8-69afdbaa6226-operator-scripts\") pod \"neutron-db-create-f6bkw\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: 
I1125 15:25:20.176024 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbnwn\" (UniqueName: \"kubernetes.io/projected/b2155fa2-3b94-4413-84de-171185b0d253-kube-api-access-zbnwn\") pod \"barbican-ae27-account-create-2ng9k\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.176496 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrkw\" (UniqueName: \"kubernetes.io/projected/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-kube-api-access-msrkw\") pod \"cinder-a38e-account-create-jh8kh\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.182616 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tth5\" (UniqueName: \"kubernetes.io/projected/61094cee-413d-4b91-ace8-69afdbaa6226-kube-api-access-4tth5\") pod \"neutron-db-create-f6bkw\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.194697 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c5ln\" (UniqueName: \"kubernetes.io/projected/6c898509-daeb-4566-b5b1-39f4b742b5d0-kube-api-access-5c5ln\") pod \"cinder-db-create-k2wml\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.195139 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9vjn\" (UniqueName: \"kubernetes.io/projected/656770af-f30e-490a-9987-71cc29c5e278-kube-api-access-n9vjn\") pod \"barbican-db-create-krr6h\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.210118 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gg7\" (UniqueName: \"kubernetes.io/projected/59ff1c96-a8a8-431e-ab9b-a76d3992463f-kube-api-access-w5gg7\") pod \"neutron-17b8-account-create-f8lf2\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.210307 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff1c96-a8a8-431e-ab9b-a76d3992463f-operator-scripts\") pod \"neutron-17b8-account-create-f8lf2\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.210959 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff1c96-a8a8-431e-ab9b-a76d3992463f-operator-scripts\") pod \"neutron-17b8-account-create-f8lf2\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.233355 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.234350 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gg7\" (UniqueName: \"kubernetes.io/projected/59ff1c96-a8a8-431e-ab9b-a76d3992463f-kube-api-access-w5gg7\") pod \"neutron-17b8-account-create-f8lf2\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.330552 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.429794 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.469299 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.471485 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.474509 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.680096 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-krr6h"] Nov 25 15:25:20 crc kubenswrapper[4965]: I1125 15:25:20.867511 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krr6h" event={"ID":"656770af-f30e-490a-9987-71cc29c5e278","Type":"ContainerStarted","Data":"50b15f4249d44dc1b2854d915c720690132c9c1817a4e852c1f9841e8a9104b0"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.382932 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.457443 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k2wml"] Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.568133 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-17b8-account-create-f8lf2"] Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.637137 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-f6bkw"] Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.666713 
4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ae27-account-create-2ng9k"] Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.681207 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a38e-account-create-jh8kh"] Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.876103 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-17b8-account-create-f8lf2" event={"ID":"59ff1c96-a8a8-431e-ab9b-a76d3992463f","Type":"ContainerStarted","Data":"1030d7dc06e8080135fd865c891265f319d7a05846ef1304b0e7a6b0668b8f96"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.876444 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-17b8-account-create-f8lf2" event={"ID":"59ff1c96-a8a8-431e-ab9b-a76d3992463f","Type":"ContainerStarted","Data":"c68add4d133378421d45351b6a0d688da835466a7769c870326f2edf2354767e"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.877827 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krr6h" event={"ID":"656770af-f30e-490a-9987-71cc29c5e278","Type":"ContainerStarted","Data":"0d489402ee0d82ef91bca3b81442c53d1028892727879e49e5b32a9c9f7e450d"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.878998 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a38e-account-create-jh8kh" event={"ID":"93df8a0d-1ef0-4598-8253-92c5bbeb95aa","Type":"ContainerStarted","Data":"a8ca10322411fb849f304f7c51ece574f1d367303eb9c36e3d8aa51cb1cf81ac"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.881008 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ae27-account-create-2ng9k" event={"ID":"b2155fa2-3b94-4413-84de-171185b0d253","Type":"ContainerStarted","Data":"8493723935d6c511524e335baaf35c2d479ac17adf196ce415ba3c71a8540dbf"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.882618 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-f6bkw" event={"ID":"61094cee-413d-4b91-ace8-69afdbaa6226","Type":"ContainerStarted","Data":"7c9d7352c9333111cff5e7c54aece7b0c625c0e1a2688d7fc5259cfb74f657a9"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.884342 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k2wml" event={"ID":"6c898509-daeb-4566-b5b1-39f4b742b5d0","Type":"ContainerStarted","Data":"5bdb486b59ee958aed2ea02dd64e3833a4ae9389982b8559b3f55aac39793306"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.884378 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k2wml" event={"ID":"6c898509-daeb-4566-b5b1-39f4b742b5d0","Type":"ContainerStarted","Data":"a8737c624fc387eab4d553642886cb618cec8bd68e5af93b0c94a96b2a9ada93"} Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.899868 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-krr6h" podStartSLOduration=2.899846891 podStartE2EDuration="2.899846891s" podCreationTimestamp="2025-11-25 15:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:21.892398729 +0000 UTC m=+1266.859992475" watchObservedRunningTime="2025-11-25 15:25:21.899846891 +0000 UTC m=+1266.867440637" Nov 25 15:25:21 crc kubenswrapper[4965]: I1125 15:25:21.918469 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-k2wml" podStartSLOduration=2.9184507870000003 podStartE2EDuration="2.918450787s" podCreationTimestamp="2025-11-25 15:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:21.916767331 +0000 UTC m=+1266.884361077" watchObservedRunningTime="2025-11-25 15:25:21.918450787 +0000 UTC m=+1266.886044533" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 
15:25:22.143192 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jlj9w"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.144832 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.155885 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jlj9w"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.248312 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98n99\" (UniqueName: \"kubernetes.io/projected/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-kube-api-access-98n99\") pod \"keystone-db-create-jlj9w\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.248381 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-operator-scripts\") pod \"keystone-db-create-jlj9w\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.339753 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4468-account-create-vm2dh"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.341415 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.346786 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.347480 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4468-account-create-vm2dh"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.350410 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98n99\" (UniqueName: \"kubernetes.io/projected/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-kube-api-access-98n99\") pod \"keystone-db-create-jlj9w\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.350503 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-operator-scripts\") pod \"keystone-db-create-jlj9w\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.351670 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-operator-scripts\") pod \"keystone-db-create-jlj9w\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.372855 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98n99\" (UniqueName: \"kubernetes.io/projected/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-kube-api-access-98n99\") pod \"keystone-db-create-jlj9w\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 
15:25:22.453946 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdcd\" (UniqueName: \"kubernetes.io/projected/ea9637eb-b283-42b2-9089-b0bca2df1b8a-kube-api-access-dxdcd\") pod \"keystone-4468-account-create-vm2dh\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.454313 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9637eb-b283-42b2-9089-b0bca2df1b8a-operator-scripts\") pod \"keystone-4468-account-create-vm2dh\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.539273 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-s5wpj"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.542219 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.548722 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s5wpj"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.559023 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdcd\" (UniqueName: \"kubernetes.io/projected/ea9637eb-b283-42b2-9089-b0bca2df1b8a-kube-api-access-dxdcd\") pod \"keystone-4468-account-create-vm2dh\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.559222 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9637eb-b283-42b2-9089-b0bca2df1b8a-operator-scripts\") pod \"keystone-4468-account-create-vm2dh\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.560026 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9637eb-b283-42b2-9089-b0bca2df1b8a-operator-scripts\") pod \"keystone-4468-account-create-vm2dh\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.596304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdcd\" (UniqueName: \"kubernetes.io/projected/ea9637eb-b283-42b2-9089-b0bca2df1b8a-kube-api-access-dxdcd\") pod \"keystone-4468-account-create-vm2dh\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.641927 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-69c1-account-create-z47b6"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.643458 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.650423 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.652920 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69c1-account-create-z47b6"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.653834 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.668329 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.670462 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984zb\" (UniqueName: \"kubernetes.io/projected/6721ea95-4d66-4dc4-b502-a4e6be931279-kube-api-access-984zb\") pod \"placement-db-create-s5wpj\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.670658 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6721ea95-4d66-4dc4-b502-a4e6be931279-operator-scripts\") pod \"placement-db-create-s5wpj\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.741620 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-p7fhq"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.745692 4965 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.775175 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-984zb\" (UniqueName: \"kubernetes.io/projected/6721ea95-4d66-4dc4-b502-a4e6be931279-kube-api-access-984zb\") pod \"placement-db-create-s5wpj\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.775334 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8db14bf-f245-4b59-a601-b9319b4b2e23-operator-scripts\") pod \"placement-69c1-account-create-z47b6\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.775424 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6721ea95-4d66-4dc4-b502-a4e6be931279-operator-scripts\") pod \"placement-db-create-s5wpj\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.775520 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnd4\" (UniqueName: \"kubernetes.io/projected/c8db14bf-f245-4b59-a601-b9319b4b2e23-kube-api-access-ssnd4\") pod \"placement-69c1-account-create-z47b6\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.776182 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6721ea95-4d66-4dc4-b502-a4e6be931279-operator-scripts\") pod \"placement-db-create-s5wpj\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.796532 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-984zb\" (UniqueName: \"kubernetes.io/projected/6721ea95-4d66-4dc4-b502-a4e6be931279-kube-api-access-984zb\") pod \"placement-db-create-s5wpj\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.801635 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p7fhq"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.859989 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7113-account-create-l65nk"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.864408 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.868545 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.878863 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.879680 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8db14bf-f245-4b59-a601-b9319b4b2e23-operator-scripts\") pod \"placement-69c1-account-create-z47b6\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.879760 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q46v9\" (UniqueName: \"kubernetes.io/projected/c57d631d-ea34-407d-be71-02c7bf7bc2e4-kube-api-access-q46v9\") pod \"glance-db-create-p7fhq\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.879826 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnd4\" (UniqueName: \"kubernetes.io/projected/c8db14bf-f245-4b59-a601-b9319b4b2e23-kube-api-access-ssnd4\") pod \"placement-69c1-account-create-z47b6\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.879909 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c57d631d-ea34-407d-be71-02c7bf7bc2e4-operator-scripts\") pod \"glance-db-create-p7fhq\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.882064 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8db14bf-f245-4b59-a601-b9319b4b2e23-operator-scripts\") pod \"placement-69c1-account-create-z47b6\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.888339 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7113-account-create-l65nk"] Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.912678 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnd4\" (UniqueName: \"kubernetes.io/projected/c8db14bf-f245-4b59-a601-b9319b4b2e23-kube-api-access-ssnd4\") pod \"placement-69c1-account-create-z47b6\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.916112 4965 generic.go:334] "Generic (PLEG): container finished" podID="b2155fa2-3b94-4413-84de-171185b0d253" containerID="c3a5b946b44a637dab4555a17b2766a9c760ec14be5fd4395a0a2479cd6dafaf" exitCode=0 Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.916583 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ae27-account-create-2ng9k" event={"ID":"b2155fa2-3b94-4413-84de-171185b0d253","Type":"ContainerDied","Data":"c3a5b946b44a637dab4555a17b2766a9c760ec14be5fd4395a0a2479cd6dafaf"} Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.926377 4965 generic.go:334] "Generic (PLEG): container finished" podID="61094cee-413d-4b91-ace8-69afdbaa6226" containerID="446a0de38926fe665a87a5e88a9915badb47641538bd91c541808557d625bb7b" exitCode=0 Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.926440 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f6bkw" event={"ID":"61094cee-413d-4b91-ace8-69afdbaa6226","Type":"ContainerDied","Data":"446a0de38926fe665a87a5e88a9915badb47641538bd91c541808557d625bb7b"} Nov 25 15:25:22 
crc kubenswrapper[4965]: I1125 15:25:22.938093 4965 generic.go:334] "Generic (PLEG): container finished" podID="6c898509-daeb-4566-b5b1-39f4b742b5d0" containerID="5bdb486b59ee958aed2ea02dd64e3833a4ae9389982b8559b3f55aac39793306" exitCode=0 Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.938303 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k2wml" event={"ID":"6c898509-daeb-4566-b5b1-39f4b742b5d0","Type":"ContainerDied","Data":"5bdb486b59ee958aed2ea02dd64e3833a4ae9389982b8559b3f55aac39793306"} Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.941622 4965 generic.go:334] "Generic (PLEG): container finished" podID="59ff1c96-a8a8-431e-ab9b-a76d3992463f" containerID="1030d7dc06e8080135fd865c891265f319d7a05846ef1304b0e7a6b0668b8f96" exitCode=0 Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.941883 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-17b8-account-create-f8lf2" event={"ID":"59ff1c96-a8a8-431e-ab9b-a76d3992463f","Type":"ContainerDied","Data":"1030d7dc06e8080135fd865c891265f319d7a05846ef1304b0e7a6b0668b8f96"} Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.949346 4965 generic.go:334] "Generic (PLEG): container finished" podID="656770af-f30e-490a-9987-71cc29c5e278" containerID="0d489402ee0d82ef91bca3b81442c53d1028892727879e49e5b32a9c9f7e450d" exitCode=0 Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.949423 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krr6h" event={"ID":"656770af-f30e-490a-9987-71cc29c5e278","Type":"ContainerDied","Data":"0d489402ee0d82ef91bca3b81442c53d1028892727879e49e5b32a9c9f7e450d"} Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.956494 4965 generic.go:334] "Generic (PLEG): container finished" podID="93df8a0d-1ef0-4598-8253-92c5bbeb95aa" containerID="d6b6591730a09518e037ffc330a8fc32d7d615522fbfebf00dba44422303a8fb" exitCode=0 Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 
15:25:22.956554 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a38e-account-create-jh8kh" event={"ID":"93df8a0d-1ef0-4598-8253-92c5bbeb95aa","Type":"ContainerDied","Data":"d6b6591730a09518e037ffc330a8fc32d7d615522fbfebf00dba44422303a8fb"} Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.964501 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.983605 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6z2t\" (UniqueName: \"kubernetes.io/projected/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-kube-api-access-p6z2t\") pod \"glance-7113-account-create-l65nk\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.983687 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q46v9\" (UniqueName: \"kubernetes.io/projected/c57d631d-ea34-407d-be71-02c7bf7bc2e4-kube-api-access-q46v9\") pod \"glance-db-create-p7fhq\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.983743 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-operator-scripts\") pod \"glance-7113-account-create-l65nk\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.983779 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c57d631d-ea34-407d-be71-02c7bf7bc2e4-operator-scripts\") pod \"glance-db-create-p7fhq\" (UID: 
\"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:22 crc kubenswrapper[4965]: I1125 15:25:22.984464 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c57d631d-ea34-407d-be71-02c7bf7bc2e4-operator-scripts\") pod \"glance-db-create-p7fhq\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.012690 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q46v9\" (UniqueName: \"kubernetes.io/projected/c57d631d-ea34-407d-be71-02c7bf7bc2e4-kube-api-access-q46v9\") pod \"glance-db-create-p7fhq\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.086933 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-operator-scripts\") pod \"glance-7113-account-create-l65nk\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.087150 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6z2t\" (UniqueName: \"kubernetes.io/projected/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-kube-api-access-p6z2t\") pod \"glance-7113-account-create-l65nk\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.088639 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-operator-scripts\") pod \"glance-7113-account-create-l65nk\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " 
pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.101223 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.105778 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6z2t\" (UniqueName: \"kubernetes.io/projected/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-kube-api-access-p6z2t\") pod \"glance-7113-account-create-l65nk\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.223416 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.229630 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jlj9w"] Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.304402 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4468-account-create-vm2dh"] Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.378321 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s5wpj"] Nov 25 15:25:23 crc kubenswrapper[4965]: W1125 15:25:23.404827 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6721ea95_4d66_4dc4_b502_a4e6be931279.slice/crio-0381c09a1919c34e414e897bf26772f3c8585e82faf55760b6c46da28c3a3c40 WatchSource:0}: Error finding container 0381c09a1919c34e414e897bf26772f3c8585e82faf55760b6c46da28c3a3c40: Status 404 returned error can't find the container with id 0381c09a1919c34e414e897bf26772f3c8585e82faf55760b6c46da28c3a3c40 Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.565373 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-69c1-account-create-z47b6"] Nov 25 15:25:23 crc kubenswrapper[4965]: W1125 15:25:23.574178 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8db14bf_f245_4b59_a601_b9319b4b2e23.slice/crio-14fc922cf832463af8380ce3ce599580a0933676f3ad38237a32873a31606663 WatchSource:0}: Error finding container 14fc922cf832463af8380ce3ce599580a0933676f3ad38237a32873a31606663: Status 404 returned error can't find the container with id 14fc922cf832463af8380ce3ce599580a0933676f3ad38237a32873a31606663 Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.683174 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p7fhq"] Nov 25 15:25:23 crc kubenswrapper[4965]: W1125 15:25:23.685470 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc57d631d_ea34_407d_be71_02c7bf7bc2e4.slice/crio-169ac05e2a7f935c7d36b2b85cb751dd1b07328bf4d2ab476038cf7baf6bbb2b WatchSource:0}: Error finding container 169ac05e2a7f935c7d36b2b85cb751dd1b07328bf4d2ab476038cf7baf6bbb2b: Status 404 returned error can't find the container with id 169ac05e2a7f935c7d36b2b85cb751dd1b07328bf4d2ab476038cf7baf6bbb2b Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.774640 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7113-account-create-l65nk"] Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.964665 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s5wpj" event={"ID":"6721ea95-4d66-4dc4-b502-a4e6be931279","Type":"ContainerStarted","Data":"9baa1a982883b23ace0afeafe8e26054c0e2072ed06492309cd844484d0f5401"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.965027 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s5wpj" 
event={"ID":"6721ea95-4d66-4dc4-b502-a4e6be931279","Type":"ContainerStarted","Data":"0381c09a1919c34e414e897bf26772f3c8585e82faf55760b6c46da28c3a3c40"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.967138 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4468-account-create-vm2dh" event={"ID":"ea9637eb-b283-42b2-9089-b0bca2df1b8a","Type":"ContainerStarted","Data":"8caadc073e744616463e2f51ef22e4fc57e6b6067ab93eee7ddcb6baaf14875c"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.967176 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4468-account-create-vm2dh" event={"ID":"ea9637eb-b283-42b2-9089-b0bca2df1b8a","Type":"ContainerStarted","Data":"91f245dddfc0c9dac1a571a72f2eedde36c15f9010d99c4ceb9f8292bb0dd45a"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.968747 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7fhq" event={"ID":"c57d631d-ea34-407d-be71-02c7bf7bc2e4","Type":"ContainerStarted","Data":"169ac05e2a7f935c7d36b2b85cb751dd1b07328bf4d2ab476038cf7baf6bbb2b"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.970342 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7113-account-create-l65nk" event={"ID":"fb1d2b24-a9a4-4fde-9a7d-6875a537887f","Type":"ContainerStarted","Data":"a805fdfcc5a3a142da6caa2d53e7b565385c4371bd013ad0166f92cbf5db0345"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.971922 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jlj9w" event={"ID":"78ed68a6-1a16-4e43-86a2-7901c1c7aa35","Type":"ContainerStarted","Data":"743f5cfd5ad15220384b64b72b0920716e12c76ca301428ad251c152329d78dd"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.971959 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jlj9w" 
event={"ID":"78ed68a6-1a16-4e43-86a2-7901c1c7aa35","Type":"ContainerStarted","Data":"2372eb9547baea413d4b8a44209310ed7a38243df7350d00b199029af2849ef2"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.974029 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c1-account-create-z47b6" event={"ID":"c8db14bf-f245-4b59-a601-b9319b4b2e23","Type":"ContainerStarted","Data":"9fdea94f82c0c0d97362c83c9514d5dbeed0dd6311105b5d146c7d2da5c965a6"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.974067 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c1-account-create-z47b6" event={"ID":"c8db14bf-f245-4b59-a601-b9319b4b2e23","Type":"ContainerStarted","Data":"14fc922cf832463af8380ce3ce599580a0933676f3ad38237a32873a31606663"} Nov 25 15:25:23 crc kubenswrapper[4965]: I1125 15:25:23.998020 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-s5wpj" podStartSLOduration=1.997995191 podStartE2EDuration="1.997995191s" podCreationTimestamp="2025-11-25 15:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:23.983417804 +0000 UTC m=+1268.951011550" watchObservedRunningTime="2025-11-25 15:25:23.997995191 +0000 UTC m=+1268.965588937" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.318463 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.423920 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tth5\" (UniqueName: \"kubernetes.io/projected/61094cee-413d-4b91-ace8-69afdbaa6226-kube-api-access-4tth5\") pod \"61094cee-413d-4b91-ace8-69afdbaa6226\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.424225 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61094cee-413d-4b91-ace8-69afdbaa6226-operator-scripts\") pod \"61094cee-413d-4b91-ace8-69afdbaa6226\" (UID: \"61094cee-413d-4b91-ace8-69afdbaa6226\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.430558 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61094cee-413d-4b91-ace8-69afdbaa6226-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61094cee-413d-4b91-ace8-69afdbaa6226" (UID: "61094cee-413d-4b91-ace8-69afdbaa6226"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.440871 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61094cee-413d-4b91-ace8-69afdbaa6226-kube-api-access-4tth5" (OuterVolumeSpecName: "kube-api-access-4tth5") pod "61094cee-413d-4b91-ace8-69afdbaa6226" (UID: "61094cee-413d-4b91-ace8-69afdbaa6226"). InnerVolumeSpecName "kube-api-access-4tth5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.527680 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61094cee-413d-4b91-ace8-69afdbaa6226-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.528502 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tth5\" (UniqueName: \"kubernetes.io/projected/61094cee-413d-4b91-ace8-69afdbaa6226-kube-api-access-4tth5\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.843803 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.857206 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.868691 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.899943 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.908089 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.942705 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5gg7\" (UniqueName: \"kubernetes.io/projected/59ff1c96-a8a8-431e-ab9b-a76d3992463f-kube-api-access-w5gg7\") pod \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.943072 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9vjn\" (UniqueName: \"kubernetes.io/projected/656770af-f30e-490a-9987-71cc29c5e278-kube-api-access-n9vjn\") pod \"656770af-f30e-490a-9987-71cc29c5e278\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.943293 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/656770af-f30e-490a-9987-71cc29c5e278-operator-scripts\") pod \"656770af-f30e-490a-9987-71cc29c5e278\" (UID: \"656770af-f30e-490a-9987-71cc29c5e278\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.943894 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msrkw\" (UniqueName: \"kubernetes.io/projected/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-kube-api-access-msrkw\") pod \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.943803 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/656770af-f30e-490a-9987-71cc29c5e278-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "656770af-f30e-490a-9987-71cc29c5e278" (UID: "656770af-f30e-490a-9987-71cc29c5e278"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.944134 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-operator-scripts\") pod \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\" (UID: \"93df8a0d-1ef0-4598-8253-92c5bbeb95aa\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.944516 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff1c96-a8a8-431e-ab9b-a76d3992463f-operator-scripts\") pod \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\" (UID: \"59ff1c96-a8a8-431e-ab9b-a76d3992463f\") " Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.945188 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/656770af-f30e-490a-9987-71cc29c5e278-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.945196 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93df8a0d-1ef0-4598-8253-92c5bbeb95aa" (UID: "93df8a0d-1ef0-4598-8253-92c5bbeb95aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.945611 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ff1c96-a8a8-431e-ab9b-a76d3992463f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59ff1c96-a8a8-431e-ab9b-a76d3992463f" (UID: "59ff1c96-a8a8-431e-ab9b-a76d3992463f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.956136 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/656770af-f30e-490a-9987-71cc29c5e278-kube-api-access-n9vjn" (OuterVolumeSpecName: "kube-api-access-n9vjn") pod "656770af-f30e-490a-9987-71cc29c5e278" (UID: "656770af-f30e-490a-9987-71cc29c5e278"). InnerVolumeSpecName "kube-api-access-n9vjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.956481 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ff1c96-a8a8-431e-ab9b-a76d3992463f-kube-api-access-w5gg7" (OuterVolumeSpecName: "kube-api-access-w5gg7") pod "59ff1c96-a8a8-431e-ab9b-a76d3992463f" (UID: "59ff1c96-a8a8-431e-ab9b-a76d3992463f"). InnerVolumeSpecName "kube-api-access-w5gg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.957666 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-kube-api-access-msrkw" (OuterVolumeSpecName: "kube-api-access-msrkw") pod "93df8a0d-1ef0-4598-8253-92c5bbeb95aa" (UID: "93df8a0d-1ef0-4598-8253-92c5bbeb95aa"). InnerVolumeSpecName "kube-api-access-msrkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.984352 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-17b8-account-create-f8lf2" event={"ID":"59ff1c96-a8a8-431e-ab9b-a76d3992463f","Type":"ContainerDied","Data":"c68add4d133378421d45351b6a0d688da835466a7769c870326f2edf2354767e"} Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.984393 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c68add4d133378421d45351b6a0d688da835466a7769c870326f2edf2354767e" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.984450 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-17b8-account-create-f8lf2" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.990329 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f6bkw" event={"ID":"61094cee-413d-4b91-ace8-69afdbaa6226","Type":"ContainerDied","Data":"7c9d7352c9333111cff5e7c54aece7b0c625c0e1a2688d7fc5259cfb74f657a9"} Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.990376 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9d7352c9333111cff5e7c54aece7b0c625c0e1a2688d7fc5259cfb74f657a9" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.990439 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-f6bkw" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.992684 4965 generic.go:334] "Generic (PLEG): container finished" podID="ea9637eb-b283-42b2-9089-b0bca2df1b8a" containerID="8caadc073e744616463e2f51ef22e4fc57e6b6067ab93eee7ddcb6baaf14875c" exitCode=0 Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.992760 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4468-account-create-vm2dh" event={"ID":"ea9637eb-b283-42b2-9089-b0bca2df1b8a","Type":"ContainerDied","Data":"8caadc073e744616463e2f51ef22e4fc57e6b6067ab93eee7ddcb6baaf14875c"} Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.996720 4965 generic.go:334] "Generic (PLEG): container finished" podID="c57d631d-ea34-407d-be71-02c7bf7bc2e4" containerID="5cda8458a5634f0c2eb16b9ac703fa498b930b59e5454f2d8ea7320a6deb7dd6" exitCode=0 Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.996826 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7fhq" event={"ID":"c57d631d-ea34-407d-be71-02c7bf7bc2e4","Type":"ContainerDied","Data":"5cda8458a5634f0c2eb16b9ac703fa498b930b59e5454f2d8ea7320a6deb7dd6"} Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.999448 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k2wml" event={"ID":"6c898509-daeb-4566-b5b1-39f4b742b5d0","Type":"ContainerDied","Data":"a8737c624fc387eab4d553642886cb618cec8bd68e5af93b0c94a96b2a9ada93"} Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.999517 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8737c624fc387eab4d553642886cb618cec8bd68e5af93b0c94a96b2a9ada93" Nov 25 15:25:24 crc kubenswrapper[4965]: I1125 15:25:24.999607 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-k2wml" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.002417 4965 generic.go:334] "Generic (PLEG): container finished" podID="c8db14bf-f245-4b59-a601-b9319b4b2e23" containerID="9fdea94f82c0c0d97362c83c9514d5dbeed0dd6311105b5d146c7d2da5c965a6" exitCode=0 Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.002541 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c1-account-create-z47b6" event={"ID":"c8db14bf-f245-4b59-a601-b9319b4b2e23","Type":"ContainerDied","Data":"9fdea94f82c0c0d97362c83c9514d5dbeed0dd6311105b5d146c7d2da5c965a6"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.005137 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ae27-account-create-2ng9k" event={"ID":"b2155fa2-3b94-4413-84de-171185b0d253","Type":"ContainerDied","Data":"8493723935d6c511524e335baaf35c2d479ac17adf196ce415ba3c71a8540dbf"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.005193 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8493723935d6c511524e335baaf35c2d479ac17adf196ce415ba3c71a8540dbf" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.005372 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ae27-account-create-2ng9k" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.007360 4965 generic.go:334] "Generic (PLEG): container finished" podID="6721ea95-4d66-4dc4-b502-a4e6be931279" containerID="9baa1a982883b23ace0afeafe8e26054c0e2072ed06492309cd844484d0f5401" exitCode=0 Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.008735 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s5wpj" event={"ID":"6721ea95-4d66-4dc4-b502-a4e6be931279","Type":"ContainerDied","Data":"9baa1a982883b23ace0afeafe8e26054c0e2072ed06492309cd844484d0f5401"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.014318 4965 generic.go:334] "Generic (PLEG): container finished" podID="fb1d2b24-a9a4-4fde-9a7d-6875a537887f" containerID="80cefc3a27566e1dd1c5c312795fd602136a0cc9094b1586d81dd01cb2d77937" exitCode=0 Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.014593 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7113-account-create-l65nk" event={"ID":"fb1d2b24-a9a4-4fde-9a7d-6875a537887f","Type":"ContainerDied","Data":"80cefc3a27566e1dd1c5c312795fd602136a0cc9094b1586d81dd01cb2d77937"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.016947 4965 generic.go:334] "Generic (PLEG): container finished" podID="78ed68a6-1a16-4e43-86a2-7901c1c7aa35" containerID="743f5cfd5ad15220384b64b72b0920716e12c76ca301428ad251c152329d78dd" exitCode=0 Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.017096 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jlj9w" event={"ID":"78ed68a6-1a16-4e43-86a2-7901c1c7aa35","Type":"ContainerDied","Data":"743f5cfd5ad15220384b64b72b0920716e12c76ca301428ad251c152329d78dd"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.022152 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-krr6h" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.022589 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krr6h" event={"ID":"656770af-f30e-490a-9987-71cc29c5e278","Type":"ContainerDied","Data":"50b15f4249d44dc1b2854d915c720690132c9c1817a4e852c1f9841e8a9104b0"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.022853 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b15f4249d44dc1b2854d915c720690132c9c1817a4e852c1f9841e8a9104b0" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.034177 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a38e-account-create-jh8kh" event={"ID":"93df8a0d-1ef0-4598-8253-92c5bbeb95aa","Type":"ContainerDied","Data":"a8ca10322411fb849f304f7c51ece574f1d367303eb9c36e3d8aa51cb1cf81ac"} Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.035321 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ca10322411fb849f304f7c51ece574f1d367303eb9c36e3d8aa51cb1cf81ac" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.035493 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a38e-account-create-jh8kh" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.045813 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2155fa2-3b94-4413-84de-171185b0d253-operator-scripts\") pod \"b2155fa2-3b94-4413-84de-171185b0d253\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.045950 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbnwn\" (UniqueName: \"kubernetes.io/projected/b2155fa2-3b94-4413-84de-171185b0d253-kube-api-access-zbnwn\") pod \"b2155fa2-3b94-4413-84de-171185b0d253\" (UID: \"b2155fa2-3b94-4413-84de-171185b0d253\") " Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046090 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c5ln\" (UniqueName: \"kubernetes.io/projected/6c898509-daeb-4566-b5b1-39f4b742b5d0-kube-api-access-5c5ln\") pod \"6c898509-daeb-4566-b5b1-39f4b742b5d0\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046129 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c898509-daeb-4566-b5b1-39f4b742b5d0-operator-scripts\") pod \"6c898509-daeb-4566-b5b1-39f4b742b5d0\" (UID: \"6c898509-daeb-4566-b5b1-39f4b742b5d0\") " Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046663 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msrkw\" (UniqueName: \"kubernetes.io/projected/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-kube-api-access-msrkw\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046679 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/93df8a0d-1ef0-4598-8253-92c5bbeb95aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046690 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff1c96-a8a8-431e-ab9b-a76d3992463f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046699 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5gg7\" (UniqueName: \"kubernetes.io/projected/59ff1c96-a8a8-431e-ab9b-a76d3992463f-kube-api-access-w5gg7\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.046709 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9vjn\" (UniqueName: \"kubernetes.io/projected/656770af-f30e-490a-9987-71cc29c5e278-kube-api-access-n9vjn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.048836 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2155fa2-3b94-4413-84de-171185b0d253-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2155fa2-3b94-4413-84de-171185b0d253" (UID: "b2155fa2-3b94-4413-84de-171185b0d253"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.051063 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c898509-daeb-4566-b5b1-39f4b742b5d0-kube-api-access-5c5ln" (OuterVolumeSpecName: "kube-api-access-5c5ln") pod "6c898509-daeb-4566-b5b1-39f4b742b5d0" (UID: "6c898509-daeb-4566-b5b1-39f4b742b5d0"). InnerVolumeSpecName "kube-api-access-5c5ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.052990 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c898509-daeb-4566-b5b1-39f4b742b5d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c898509-daeb-4566-b5b1-39f4b742b5d0" (UID: "6c898509-daeb-4566-b5b1-39f4b742b5d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.055778 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2155fa2-3b94-4413-84de-171185b0d253-kube-api-access-zbnwn" (OuterVolumeSpecName: "kube-api-access-zbnwn") pod "b2155fa2-3b94-4413-84de-171185b0d253" (UID: "b2155fa2-3b94-4413-84de-171185b0d253"). InnerVolumeSpecName "kube-api-access-zbnwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.148112 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbnwn\" (UniqueName: \"kubernetes.io/projected/b2155fa2-3b94-4413-84de-171185b0d253-kube-api-access-zbnwn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.148151 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c5ln\" (UniqueName: \"kubernetes.io/projected/6c898509-daeb-4566-b5b1-39f4b742b5d0-kube-api-access-5c5ln\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.148165 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c898509-daeb-4566-b5b1-39f4b742b5d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:25 crc kubenswrapper[4965]: I1125 15:25:25.148177 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b2155fa2-3b94-4413-84de-171185b0d253-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.044663 4965 generic.go:334] "Generic (PLEG): container finished" podID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerID="757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d" exitCode=0 Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.044685 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"67d0186d-7eca-48a0-9cc8-56ce4d1caa38","Type":"ContainerDied","Data":"757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d"} Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.451656 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.576600 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q46v9\" (UniqueName: \"kubernetes.io/projected/c57d631d-ea34-407d-be71-02c7bf7bc2e4-kube-api-access-q46v9\") pod \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.576703 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c57d631d-ea34-407d-be71-02c7bf7bc2e4-operator-scripts\") pod \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\" (UID: \"c57d631d-ea34-407d-be71-02c7bf7bc2e4\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.579143 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c57d631d-ea34-407d-be71-02c7bf7bc2e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c57d631d-ea34-407d-be71-02c7bf7bc2e4" (UID: "c57d631d-ea34-407d-be71-02c7bf7bc2e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.581684 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57d631d-ea34-407d-be71-02c7bf7bc2e4-kube-api-access-q46v9" (OuterVolumeSpecName: "kube-api-access-q46v9") pod "c57d631d-ea34-407d-be71-02c7bf7bc2e4" (UID: "c57d631d-ea34-407d-be71-02c7bf7bc2e4"). InnerVolumeSpecName "kube-api-access-q46v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.663644 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.671172 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.678156 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q46v9\" (UniqueName: \"kubernetes.io/projected/c57d631d-ea34-407d-be71-02c7bf7bc2e4-kube-api-access-q46v9\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.678181 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c57d631d-ea34-407d-be71-02c7bf7bc2e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.687601 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.721741 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.733008 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.789769 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98n99\" (UniqueName: \"kubernetes.io/projected/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-kube-api-access-98n99\") pod \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.789825 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9637eb-b283-42b2-9089-b0bca2df1b8a-operator-scripts\") pod \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.790048 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-984zb\" (UniqueName: \"kubernetes.io/projected/6721ea95-4d66-4dc4-b502-a4e6be931279-kube-api-access-984zb\") pod \"6721ea95-4d66-4dc4-b502-a4e6be931279\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.790139 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6721ea95-4d66-4dc4-b502-a4e6be931279-operator-scripts\") pod \"6721ea95-4d66-4dc4-b502-a4e6be931279\" (UID: \"6721ea95-4d66-4dc4-b502-a4e6be931279\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.790168 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6z2t\" (UniqueName: \"kubernetes.io/projected/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-kube-api-access-p6z2t\") pod \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.790192 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-operator-scripts\") pod \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\" (UID: \"fb1d2b24-a9a4-4fde-9a7d-6875a537887f\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.790233 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdcd\" (UniqueName: \"kubernetes.io/projected/ea9637eb-b283-42b2-9089-b0bca2df1b8a-kube-api-access-dxdcd\") pod \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\" (UID: \"ea9637eb-b283-42b2-9089-b0bca2df1b8a\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.790256 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-operator-scripts\") pod \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\" (UID: \"78ed68a6-1a16-4e43-86a2-7901c1c7aa35\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.791171 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6721ea95-4d66-4dc4-b502-a4e6be931279-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6721ea95-4d66-4dc4-b502-a4e6be931279" (UID: "6721ea95-4d66-4dc4-b502-a4e6be931279"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.791210 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78ed68a6-1a16-4e43-86a2-7901c1c7aa35" (UID: "78ed68a6-1a16-4e43-86a2-7901c1c7aa35"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.791918 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb1d2b24-a9a4-4fde-9a7d-6875a537887f" (UID: "fb1d2b24-a9a4-4fde-9a7d-6875a537887f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.794871 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9637eb-b283-42b2-9089-b0bca2df1b8a-kube-api-access-dxdcd" (OuterVolumeSpecName: "kube-api-access-dxdcd") pod "ea9637eb-b283-42b2-9089-b0bca2df1b8a" (UID: "ea9637eb-b283-42b2-9089-b0bca2df1b8a"). InnerVolumeSpecName "kube-api-access-dxdcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.795437 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea9637eb-b283-42b2-9089-b0bca2df1b8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea9637eb-b283-42b2-9089-b0bca2df1b8a" (UID: "ea9637eb-b283-42b2-9089-b0bca2df1b8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.797307 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-kube-api-access-98n99" (OuterVolumeSpecName: "kube-api-access-98n99") pod "78ed68a6-1a16-4e43-86a2-7901c1c7aa35" (UID: "78ed68a6-1a16-4e43-86a2-7901c1c7aa35"). InnerVolumeSpecName "kube-api-access-98n99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.801567 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-kube-api-access-p6z2t" (OuterVolumeSpecName: "kube-api-access-p6z2t") pod "fb1d2b24-a9a4-4fde-9a7d-6875a537887f" (UID: "fb1d2b24-a9a4-4fde-9a7d-6875a537887f"). InnerVolumeSpecName "kube-api-access-p6z2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.806147 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6721ea95-4d66-4dc4-b502-a4e6be931279-kube-api-access-984zb" (OuterVolumeSpecName: "kube-api-access-984zb") pod "6721ea95-4d66-4dc4-b502-a4e6be931279" (UID: "6721ea95-4d66-4dc4-b502-a4e6be931279"). InnerVolumeSpecName "kube-api-access-984zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.891223 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8db14bf-f245-4b59-a601-b9319b4b2e23-operator-scripts\") pod \"c8db14bf-f245-4b59-a601-b9319b4b2e23\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.891558 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssnd4\" (UniqueName: \"kubernetes.io/projected/c8db14bf-f245-4b59-a601-b9319b4b2e23-kube-api-access-ssnd4\") pod \"c8db14bf-f245-4b59-a601-b9319b4b2e23\" (UID: \"c8db14bf-f245-4b59-a601-b9319b4b2e23\") " Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.891955 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8db14bf-f245-4b59-a601-b9319b4b2e23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"c8db14bf-f245-4b59-a601-b9319b4b2e23" (UID: "c8db14bf-f245-4b59-a601-b9319b4b2e23"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.892578 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-984zb\" (UniqueName: \"kubernetes.io/projected/6721ea95-4d66-4dc4-b502-a4e6be931279-kube-api-access-984zb\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893226 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8db14bf-f245-4b59-a601-b9319b4b2e23-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893307 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6721ea95-4d66-4dc4-b502-a4e6be931279-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893372 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6z2t\" (UniqueName: \"kubernetes.io/projected/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-kube-api-access-p6z2t\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893497 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1d2b24-a9a4-4fde-9a7d-6875a537887f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893553 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdcd\" (UniqueName: \"kubernetes.io/projected/ea9637eb-b283-42b2-9089-b0bca2df1b8a-kube-api-access-dxdcd\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893614 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893705 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98n99\" (UniqueName: \"kubernetes.io/projected/78ed68a6-1a16-4e43-86a2-7901c1c7aa35-kube-api-access-98n99\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.893745 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9637eb-b283-42b2-9089-b0bca2df1b8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.895565 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8db14bf-f245-4b59-a601-b9319b4b2e23-kube-api-access-ssnd4" (OuterVolumeSpecName: "kube-api-access-ssnd4") pod "c8db14bf-f245-4b59-a601-b9319b4b2e23" (UID: "c8db14bf-f245-4b59-a601-b9319b4b2e23"). InnerVolumeSpecName "kube-api-access-ssnd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:26 crc kubenswrapper[4965]: I1125 15:25:26.995262 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssnd4\" (UniqueName: \"kubernetes.io/projected/c8db14bf-f245-4b59-a601-b9319b4b2e23-kube-api-access-ssnd4\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.055621 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7fhq" event={"ID":"c57d631d-ea34-407d-be71-02c7bf7bc2e4","Type":"ContainerDied","Data":"169ac05e2a7f935c7d36b2b85cb751dd1b07328bf4d2ab476038cf7baf6bbb2b"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.055669 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169ac05e2a7f935c7d36b2b85cb751dd1b07328bf4d2ab476038cf7baf6bbb2b" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.055855 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7fhq" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.061313 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7113-account-create-l65nk" event={"ID":"fb1d2b24-a9a4-4fde-9a7d-6875a537887f","Type":"ContainerDied","Data":"a805fdfcc5a3a142da6caa2d53e7b565385c4371bd013ad0166f92cbf5db0345"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.061668 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a805fdfcc5a3a142da6caa2d53e7b565385c4371bd013ad0166f92cbf5db0345" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.061534 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7113-account-create-l65nk" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.064924 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jlj9w" event={"ID":"78ed68a6-1a16-4e43-86a2-7901c1c7aa35","Type":"ContainerDied","Data":"2372eb9547baea413d4b8a44209310ed7a38243df7350d00b199029af2849ef2"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.064961 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2372eb9547baea413d4b8a44209310ed7a38243df7350d00b199029af2849ef2" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.065110 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jlj9w" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.067386 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69c1-account-create-z47b6" event={"ID":"c8db14bf-f245-4b59-a601-b9319b4b2e23","Type":"ContainerDied","Data":"14fc922cf832463af8380ce3ce599580a0933676f3ad38237a32873a31606663"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.067414 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fc922cf832463af8380ce3ce599580a0933676f3ad38237a32873a31606663" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.067565 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69c1-account-create-z47b6" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.068714 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s5wpj" event={"ID":"6721ea95-4d66-4dc4-b502-a4e6be931279","Type":"ContainerDied","Data":"0381c09a1919c34e414e897bf26772f3c8585e82faf55760b6c46da28c3a3c40"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.068761 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0381c09a1919c34e414e897bf26772f3c8585e82faf55760b6c46da28c3a3c40" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.068730 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s5wpj" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.070141 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"67d0186d-7eca-48a0-9cc8-56ce4d1caa38","Type":"ContainerStarted","Data":"2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.070308 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.073146 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4468-account-create-vm2dh" event={"ID":"ea9637eb-b283-42b2-9089-b0bca2df1b8a","Type":"ContainerDied","Data":"91f245dddfc0c9dac1a571a72f2eedde36c15f9010d99c4ceb9f8292bb0dd45a"} Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.073174 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91f245dddfc0c9dac1a571a72f2eedde36c15f9010d99c4ceb9f8292bb0dd45a" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.073225 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4468-account-create-vm2dh" Nov 25 15:25:27 crc kubenswrapper[4965]: I1125 15:25:27.105666 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371926.74913 podStartE2EDuration="1m50.10564499s" podCreationTimestamp="2025-11-25 15:23:37 +0000 UTC" firstStartedPulling="2025-11-25 15:23:39.683195511 +0000 UTC m=+1164.650789257" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:27.105051914 +0000 UTC m=+1272.072645660" watchObservedRunningTime="2025-11-25 15:25:27.10564499 +0000 UTC m=+1272.073238736" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.181810 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-cr787"] Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201369 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ed68a6-1a16-4e43-86a2-7901c1c7aa35" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201421 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ed68a6-1a16-4e43-86a2-7901c1c7aa35" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201438 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61094cee-413d-4b91-ace8-69afdbaa6226" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201445 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="61094cee-413d-4b91-ace8-69afdbaa6226" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201468 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8db14bf-f245-4b59-a601-b9319b4b2e23" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201475 4965 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c8db14bf-f245-4b59-a601-b9319b4b2e23" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201503 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656770af-f30e-490a-9987-71cc29c5e278" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201509 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="656770af-f30e-490a-9987-71cc29c5e278" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201523 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93df8a0d-1ef0-4598-8253-92c5bbeb95aa" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201529 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="93df8a0d-1ef0-4598-8253-92c5bbeb95aa" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201535 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57d631d-ea34-407d-be71-02c7bf7bc2e4" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201546 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57d631d-ea34-407d-be71-02c7bf7bc2e4" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201577 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ff1c96-a8a8-431e-ab9b-a76d3992463f" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201582 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ff1c96-a8a8-431e-ab9b-a76d3992463f" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201596 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2155fa2-3b94-4413-84de-171185b0d253" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 
15:25:28.201602 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2155fa2-3b94-4413-84de-171185b0d253" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201615 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c898509-daeb-4566-b5b1-39f4b742b5d0" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201621 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c898509-daeb-4566-b5b1-39f4b742b5d0" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201639 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1d2b24-a9a4-4fde-9a7d-6875a537887f" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201645 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1d2b24-a9a4-4fde-9a7d-6875a537887f" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201655 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6721ea95-4d66-4dc4-b502-a4e6be931279" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201661 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6721ea95-4d66-4dc4-b502-a4e6be931279" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: E1125 15:25:28.201673 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9637eb-b283-42b2-9089-b0bca2df1b8a" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.201681 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9637eb-b283-42b2-9089-b0bca2df1b8a" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202063 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="93df8a0d-1ef0-4598-8253-92c5bbeb95aa" 
containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202098 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8db14bf-f245-4b59-a601-b9319b4b2e23" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202126 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57d631d-ea34-407d-be71-02c7bf7bc2e4" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202149 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1d2b24-a9a4-4fde-9a7d-6875a537887f" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202163 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9637eb-b283-42b2-9089-b0bca2df1b8a" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202178 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ff1c96-a8a8-431e-ab9b-a76d3992463f" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202201 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2155fa2-3b94-4413-84de-171185b0d253" containerName="mariadb-account-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202217 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6721ea95-4d66-4dc4-b502-a4e6be931279" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202231 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ed68a6-1a16-4e43-86a2-7901c1c7aa35" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202245 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c898509-daeb-4566-b5b1-39f4b742b5d0" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202268 
4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="61094cee-413d-4b91-ace8-69afdbaa6226" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.202506 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="656770af-f30e-490a-9987-71cc29c5e278" containerName="mariadb-database-create" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.203268 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.205110 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cr787"] Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.206551 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.207146 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2xpv9" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.315756 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-db-sync-config-data\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.316130 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9bz\" (UniqueName: \"kubernetes.io/projected/0546399d-f1ee-4fe8-aa16-fb64e9f58899-kube-api-access-ll9bz\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.316172 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-combined-ca-bundle\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.316258 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-config-data\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.417716 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-db-sync-config-data\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.417760 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9bz\" (UniqueName: \"kubernetes.io/projected/0546399d-f1ee-4fe8-aa16-fb64e9f58899-kube-api-access-ll9bz\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.417800 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-combined-ca-bundle\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.417830 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-config-data\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.426545 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-db-sync-config-data\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.426600 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-config-data\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.437770 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-combined-ca-bundle\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.450666 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9bz\" (UniqueName: \"kubernetes.io/projected/0546399d-f1ee-4fe8-aa16-fb64e9f58899-kube-api-access-ll9bz\") pod \"glance-db-sync-cr787\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.541984 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-cr787" Nov 25 15:25:28 crc kubenswrapper[4965]: I1125 15:25:28.971483 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cr787"] Nov 25 15:25:29 crc kubenswrapper[4965]: I1125 15:25:29.092138 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cr787" event={"ID":"0546399d-f1ee-4fe8-aa16-fb64e9f58899","Type":"ContainerStarted","Data":"2a1f7e9d4add1068920f47dcc3551e0dae6153a31a35080f71a2a2d709260ab9"} Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.843669 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qx2h5"] Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.847881 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.853149 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.859132 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v47cv" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.859147 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.859185 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.882527 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qx2h5"] Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.994033 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-combined-ca-bundle\") pod \"keystone-db-sync-qx2h5\" (UID: 
\"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.994179 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhcf2\" (UniqueName: \"kubernetes.io/projected/2fc256e5-efae-4f80-bcb9-c196c1223ac0-kube-api-access-lhcf2\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:32 crc kubenswrapper[4965]: I1125 15:25:32.994211 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-config-data\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.096752 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhcf2\" (UniqueName: \"kubernetes.io/projected/2fc256e5-efae-4f80-bcb9-c196c1223ac0-kube-api-access-lhcf2\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.096854 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-config-data\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.096919 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-combined-ca-bundle\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " 
pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.103922 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-config-data\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.104777 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-combined-ca-bundle\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.114589 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhcf2\" (UniqueName: \"kubernetes.io/projected/2fc256e5-efae-4f80-bcb9-c196c1223ac0-kube-api-access-lhcf2\") pod \"keystone-db-sync-qx2h5\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.173693 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:33 crc kubenswrapper[4965]: I1125 15:25:33.611370 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qx2h5"] Nov 25 15:25:33 crc kubenswrapper[4965]: W1125 15:25:33.617871 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fc256e5_efae_4f80_bcb9_c196c1223ac0.slice/crio-a562ea35488b2d848d8a674707eb64aad88f74fb935cb38d148116cae70b0237 WatchSource:0}: Error finding container a562ea35488b2d848d8a674707eb64aad88f74fb935cb38d148116cae70b0237: Status 404 returned error can't find the container with id a562ea35488b2d848d8a674707eb64aad88f74fb935cb38d148116cae70b0237 Nov 25 15:25:34 crc kubenswrapper[4965]: I1125 15:25:34.138083 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qx2h5" event={"ID":"2fc256e5-efae-4f80-bcb9-c196c1223ac0","Type":"ContainerStarted","Data":"a562ea35488b2d848d8a674707eb64aad88f74fb935cb38d148116cae70b0237"} Nov 25 15:25:39 crc kubenswrapper[4965]: I1125 15:25:39.255046 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Nov 25 15:25:41 crc kubenswrapper[4965]: E1125 15:25:41.143682 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2065325081/1\": happened during read: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Nov 25 15:25:41 crc kubenswrapper[4965]: E1125 15:25:41.144262 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ll9bz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod glance-db-sync-cr787_openstack(0546399d-f1ee-4fe8-aa16-fb64e9f58899): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2065325081/1\": happened during read: context canceled" logger="UnhandledError" Nov 25 15:25:41 crc kubenswrapper[4965]: E1125 15:25:41.146117 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2065325081/1\\\": happened during read: context canceled\"" pod="openstack/glance-db-sync-cr787" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" Nov 25 15:25:41 crc kubenswrapper[4965]: E1125 15:25:41.231001 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-cr787" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" Nov 25 15:25:48 crc kubenswrapper[4965]: I1125 15:25:48.288619 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qx2h5" event={"ID":"2fc256e5-efae-4f80-bcb9-c196c1223ac0","Type":"ContainerStarted","Data":"a8437bc5dd4abecb7bed19b9f36ecad826757955e7637392785a475e2dfdeeb7"} Nov 25 15:25:48 crc kubenswrapper[4965]: I1125 15:25:48.307053 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qx2h5" podStartSLOduration=2.003700254 podStartE2EDuration="16.307030572s" podCreationTimestamp="2025-11-25 15:25:32 +0000 UTC" firstStartedPulling="2025-11-25 15:25:33.621307189 +0000 UTC m=+1278.588900935" lastFinishedPulling="2025-11-25 15:25:47.924637497 +0000 UTC m=+1292.892231253" observedRunningTime="2025-11-25 15:25:48.30440169 +0000 UTC m=+1293.271995436" watchObservedRunningTime="2025-11-25 
15:25:48.307030572 +0000 UTC m=+1293.274624318" Nov 25 15:25:49 crc kubenswrapper[4965]: I1125 15:25:49.251423 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Nov 25 15:25:52 crc kubenswrapper[4965]: I1125 15:25:52.323048 4965 generic.go:334] "Generic (PLEG): container finished" podID="2fc256e5-efae-4f80-bcb9-c196c1223ac0" containerID="a8437bc5dd4abecb7bed19b9f36ecad826757955e7637392785a475e2dfdeeb7" exitCode=0 Nov 25 15:25:52 crc kubenswrapper[4965]: I1125 15:25:52.323165 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qx2h5" event={"ID":"2fc256e5-efae-4f80-bcb9-c196c1223ac0","Type":"ContainerDied","Data":"a8437bc5dd4abecb7bed19b9f36ecad826757955e7637392785a475e2dfdeeb7"} Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.624457 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.744285 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhcf2\" (UniqueName: \"kubernetes.io/projected/2fc256e5-efae-4f80-bcb9-c196c1223ac0-kube-api-access-lhcf2\") pod \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.744361 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-config-data\") pod \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.744450 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-combined-ca-bundle\") pod \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\" (UID: \"2fc256e5-efae-4f80-bcb9-c196c1223ac0\") " Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.767718 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc256e5-efae-4f80-bcb9-c196c1223ac0-kube-api-access-lhcf2" (OuterVolumeSpecName: "kube-api-access-lhcf2") pod "2fc256e5-efae-4f80-bcb9-c196c1223ac0" (UID: "2fc256e5-efae-4f80-bcb9-c196c1223ac0"). InnerVolumeSpecName "kube-api-access-lhcf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.783916 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fc256e5-efae-4f80-bcb9-c196c1223ac0" (UID: "2fc256e5-efae-4f80-bcb9-c196c1223ac0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.800895 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-config-data" (OuterVolumeSpecName: "config-data") pod "2fc256e5-efae-4f80-bcb9-c196c1223ac0" (UID: "2fc256e5-efae-4f80-bcb9-c196c1223ac0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.845865 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.845904 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhcf2\" (UniqueName: \"kubernetes.io/projected/2fc256e5-efae-4f80-bcb9-c196c1223ac0-kube-api-access-lhcf2\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:53 crc kubenswrapper[4965]: I1125 15:25:53.845920 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc256e5-efae-4f80-bcb9-c196c1223ac0-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.341407 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qx2h5" event={"ID":"2fc256e5-efae-4f80-bcb9-c196c1223ac0","Type":"ContainerDied","Data":"a562ea35488b2d848d8a674707eb64aad88f74fb935cb38d148116cae70b0237"} Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.341452 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a562ea35488b2d848d8a674707eb64aad88f74fb935cb38d148116cae70b0237" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.341515 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qx2h5" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.680938 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2x65x"] Nov 25 15:25:54 crc kubenswrapper[4965]: E1125 15:25:54.681325 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc256e5-efae-4f80-bcb9-c196c1223ac0" containerName="keystone-db-sync" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.681338 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc256e5-efae-4f80-bcb9-c196c1223ac0" containerName="keystone-db-sync" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.681532 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc256e5-efae-4f80-bcb9-c196c1223ac0" containerName="keystone-db-sync" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.682698 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.685702 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.686320 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.686468 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.686450 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v47cv" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.692623 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.735060 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-5sdll"] Nov 25 15:25:54 crc 
kubenswrapper[4965]: I1125 15:25:54.737213 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.810563 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-5sdll"] Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.810617 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2x65x"] Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864070 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-scripts\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864195 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnrsn\" (UniqueName: \"kubernetes.io/projected/2d1929ee-4d39-46a6-bdd5-684137c6844e-kube-api-access-pnrsn\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864249 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-credential-keys\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864275 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: 
\"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864356 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864388 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-fernet-keys\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864410 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864439 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-config-data\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864474 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-combined-ca-bundle\") pod 
\"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864496 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-config\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.864518 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2b82\" (UniqueName: \"kubernetes.io/projected/cb05fd28-1065-4beb-af6a-a24b1848f2b3-kube-api-access-j2b82\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.968576 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnrsn\" (UniqueName: \"kubernetes.io/projected/2d1929ee-4d39-46a6-bdd5-684137c6844e-kube-api-access-pnrsn\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969015 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-credential-keys\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969037 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: 
\"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969088 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969111 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-fernet-keys\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969131 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969151 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-config-data\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969180 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-combined-ca-bundle\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" 
Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969196 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-config\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969221 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2b82\" (UniqueName: \"kubernetes.io/projected/cb05fd28-1065-4beb-af6a-a24b1848f2b3-kube-api-access-j2b82\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.969252 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-scripts\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.971174 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.971475 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.971505 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-config\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.971534 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.968740 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.977323 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-credential-keys\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.977600 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-fernet-keys\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.979351 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-combined-ca-bundle\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.984841 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-config-data\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.985025 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:25:54 crc kubenswrapper[4965]: I1125 15:25:54.988290 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-scripts\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.022206 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.022714 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.037711 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2b82\" (UniqueName: \"kubernetes.io/projected/cb05fd28-1065-4beb-af6a-a24b1848f2b3-kube-api-access-j2b82\") pod \"keystone-bootstrap-2x65x\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.039858 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnrsn\" (UniqueName: \"kubernetes.io/projected/2d1929ee-4d39-46a6-bdd5-684137c6844e-kube-api-access-pnrsn\") pod \"dnsmasq-dns-75bb4695fc-5sdll\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.065110 4965 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.071260 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-run-httpd\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.071719 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df8f2\" (UniqueName: \"kubernetes.io/projected/fbe835dc-8361-426b-931b-96a5f52d8743-kube-api-access-df8f2\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.071840 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.071925 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-config-data\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.072036 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-scripts\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.072166 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.072283 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-log-httpd\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.071663 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.086945 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wbvvf"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.088517 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.091990 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zghfs" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.092333 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wbvvf"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.096225 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.100212 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182252 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-run-httpd\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182308 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df8f2\" (UniqueName: \"kubernetes.io/projected/fbe835dc-8361-426b-931b-96a5f52d8743-kube-api-access-df8f2\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182363 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182395 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-config-data\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182444 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-scripts\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182469 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-combined-ca-bundle\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182537 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-config\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182561 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs7qz\" (UniqueName: \"kubernetes.io/projected/c9d6c94f-3d83-4521-ac18-3eda81279450-kube-api-access-cs7qz\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182606 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.182675 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-log-httpd\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.183942 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-run-httpd\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.185951 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-log-httpd\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.191862 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-scripts\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.192699 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.197861 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.200521 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-config-data\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.216826 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df8f2\" (UniqueName: \"kubernetes.io/projected/fbe835dc-8361-426b-931b-96a5f52d8743-kube-api-access-df8f2\") pod \"ceilometer-0\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.217046 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-5sdll"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.244538 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lmtqd"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.246050 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.251602 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.252092 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-scrvm" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.252336 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.267813 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lmtqd"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.284910 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-config\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.284953 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs7qz\" (UniqueName: \"kubernetes.io/projected/c9d6c94f-3d83-4521-ac18-3eda81279450-kube-api-access-cs7qz\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.285207 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-combined-ca-bundle\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.296958 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-combined-ca-bundle\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.299555 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-config\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.305462 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.314650 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gpzgb"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.316228 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.324901 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xfd5f"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.326771 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.331010 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.331202 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tj7pv" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.331270 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.343038 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs7qz\" (UniqueName: \"kubernetes.io/projected/c9d6c94f-3d83-4521-ac18-3eda81279450-kube-api-access-cs7qz\") pod \"neutron-db-sync-wbvvf\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.367620 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gpzgb"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.388905 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-scripts\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.389398 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-db-sync-config-data\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.389418 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-combined-ca-bundle\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.389492 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-config-data\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.389570 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kdc4\" (UniqueName: \"kubernetes.io/projected/78768f1b-d9d5-4124-8fe5-bc4b357605ca-kube-api-access-4kdc4\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.389668 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78768f1b-d9d5-4124-8fe5-bc4b357605ca-etc-machine-id\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.417716 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rg5sn"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.420475 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.423199 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.423508 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gsxq7" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.451208 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xfd5f"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.474035 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rg5sn"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.486248 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.490837 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-scripts\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.490888 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4dh\" (UniqueName: \"kubernetes.io/projected/ff456420-44c5-4945-925c-2f36ae44aad3-kube-api-access-jx4dh\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.490941 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-db-sync-config-data\") pod \"cinder-db-sync-lmtqd\" (UID: 
\"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.490981 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-combined-ca-bundle\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491023 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491073 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff456420-44c5-4945-925c-2f36ae44aad3-logs\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491094 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-config-data\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491128 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-config-data\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 
crc kubenswrapper[4965]: I1125 15:25:55.491160 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-scripts\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491201 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kdc4\" (UniqueName: \"kubernetes.io/projected/78768f1b-d9d5-4124-8fe5-bc4b357605ca-kube-api-access-4kdc4\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491223 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-combined-ca-bundle\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491258 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-config\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491297 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: 
I1125 15:25:55.491335 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491370 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddcf\" (UniqueName: \"kubernetes.io/projected/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-kube-api-access-xddcf\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491396 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78768f1b-d9d5-4124-8fe5-bc4b357605ca-etc-machine-id\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.491510 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78768f1b-d9d5-4124-8fe5-bc4b357605ca-etc-machine-id\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.503185 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-combined-ca-bundle\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.506332 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-db-sync-config-data\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.518696 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.524575 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-config-data\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.525036 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kdc4\" (UniqueName: \"kubernetes.io/projected/78768f1b-d9d5-4124-8fe5-bc4b357605ca-kube-api-access-4kdc4\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.528762 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-scripts\") pod \"cinder-db-sync-lmtqd\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597513 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4dh\" (UniqueName: \"kubernetes.io/projected/ff456420-44c5-4945-925c-2f36ae44aad3-kube-api-access-jx4dh\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597589 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m57h\" (UniqueName: \"kubernetes.io/projected/00430c42-5b8c-45e7-97d3-c4c256468678-kube-api-access-7m57h\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597630 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-combined-ca-bundle\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597684 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597727 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-db-sync-config-data\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597780 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff456420-44c5-4945-925c-2f36ae44aad3-logs\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597800 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-config-data\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597854 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-scripts\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597895 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-combined-ca-bundle\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597924 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-config\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.597959 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.598020 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.598052 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddcf\" (UniqueName: \"kubernetes.io/projected/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-kube-api-access-xddcf\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.599112 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-config\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.599348 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.600537 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.601129 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-nb\") pod 
\"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.601408 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff456420-44c5-4945-925c-2f36ae44aad3-logs\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.601812 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.606958 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-combined-ca-bundle\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.615800 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-scripts\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.624625 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-config-data\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.651459 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-5sdll"] Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.656463 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddcf\" (UniqueName: \"kubernetes.io/projected/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-kube-api-access-xddcf\") pod \"dnsmasq-dns-745b9ddc8c-xfd5f\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.657262 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4dh\" (UniqueName: \"kubernetes.io/projected/ff456420-44c5-4945-925c-2f36ae44aad3-kube-api-access-jx4dh\") pod \"placement-db-sync-gpzgb\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.661370 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gpzgb" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.705387 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m57h\" (UniqueName: \"kubernetes.io/projected/00430c42-5b8c-45e7-97d3-c4c256468678-kube-api-access-7m57h\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.705432 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-combined-ca-bundle\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.705475 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-db-sync-config-data\") pod \"barbican-db-sync-rg5sn\" (UID: 
\"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.721599 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.726086 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-combined-ca-bundle\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.744537 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-db-sync-config-data\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.756526 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m57h\" (UniqueName: \"kubernetes.io/projected/00430c42-5b8c-45e7-97d3-c4c256468678-kube-api-access-7m57h\") pod \"barbican-db-sync-rg5sn\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:55 crc kubenswrapper[4965]: I1125 15:25:55.774153 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.079561 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2x65x"] Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.400734 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.427066 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" event={"ID":"2d1929ee-4d39-46a6-bdd5-684137c6844e","Type":"ContainerStarted","Data":"13e11a4aa4d4618d2886afcf9374a6c3388d81d3e2f10d388d185447b705a760"} Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.434524 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2x65x" event={"ID":"cb05fd28-1065-4beb-af6a-a24b1848f2b3","Type":"ContainerStarted","Data":"5d10637f2ff2c872a60d85dd60e4154d956d9ff07bee57afd2e796c84e4d9a60"} Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.454447 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerStarted","Data":"24aa86de256f738185dbb129671078de0b213dc0b5e07176688cc0f1f1f707b5"} Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.608618 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gpzgb"] Nov 25 15:25:56 crc kubenswrapper[4965]: W1125 15:25:56.609740 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78768f1b_d9d5_4124_8fe5_bc4b357605ca.slice/crio-c21fb1973289bc2c6c344e7b6d96181c76c7486ea9b3b91c629d41271e968439 WatchSource:0}: Error finding container c21fb1973289bc2c6c344e7b6d96181c76c7486ea9b3b91c629d41271e968439: Status 404 returned error can't find the container with id c21fb1973289bc2c6c344e7b6d96181c76c7486ea9b3b91c629d41271e968439 
Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.621358 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lmtqd"] Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.640775 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wbvvf"] Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.723678 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rg5sn"] Nov 25 15:25:56 crc kubenswrapper[4965]: W1125 15:25:56.801626 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2582ae0_ed83_440f_a1e0_2e9e65d9a005.slice/crio-656a2c8543d7606ed356ab139c371a90f6960680e2b5f43b4de655adc3d426c9 WatchSource:0}: Error finding container 656a2c8543d7606ed356ab139c371a90f6960680e2b5f43b4de655adc3d426c9: Status 404 returned error can't find the container with id 656a2c8543d7606ed356ab139c371a90f6960680e2b5f43b4de655adc3d426c9 Nov 25 15:25:56 crc kubenswrapper[4965]: I1125 15:25:56.805914 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xfd5f"] Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.134307 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.508585 4965 generic.go:334] "Generic (PLEG): container finished" podID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerID="d25930367d8c8f1b1a77995eb25797cb8e5443026ddbf15ecc5abcc611111475" exitCode=0 Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.508902 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" event={"ID":"a2582ae0-ed83-440f-a1e0-2e9e65d9a005","Type":"ContainerDied","Data":"d25930367d8c8f1b1a77995eb25797cb8e5443026ddbf15ecc5abcc611111475"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.508928 4965 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" event={"ID":"a2582ae0-ed83-440f-a1e0-2e9e65d9a005","Type":"ContainerStarted","Data":"656a2c8543d7606ed356ab139c371a90f6960680e2b5f43b4de655adc3d426c9"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.518463 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2x65x" event={"ID":"cb05fd28-1065-4beb-af6a-a24b1848f2b3","Type":"ContainerStarted","Data":"5d94320ba1d71d2e918a3a347c3890dba983d183d197427156eb60aa5b159774"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.525660 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wbvvf" event={"ID":"c9d6c94f-3d83-4521-ac18-3eda81279450","Type":"ContainerStarted","Data":"d7a5f784131138dab7cfff054577351056b9ac1c10c088ec78d4673b33c01d4a"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.525765 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wbvvf" event={"ID":"c9d6c94f-3d83-4521-ac18-3eda81279450","Type":"ContainerStarted","Data":"9779204ed5bb453142d2cd5f8f8a61eb5ee52a68b454325e10fd24078365e0d5"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.557251 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rg5sn" event={"ID":"00430c42-5b8c-45e7-97d3-c4c256468678","Type":"ContainerStarted","Data":"a9f8a43c523d05fead483eec2e02da8a9c6f63a0adb4b4a013332838bb29cf54"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.574229 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lmtqd" event={"ID":"78768f1b-d9d5-4124-8fe5-bc4b357605ca","Type":"ContainerStarted","Data":"c21fb1973289bc2c6c344e7b6d96181c76c7486ea9b3b91c629d41271e968439"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.577057 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2x65x" podStartSLOduration=3.5770316060000003 
podStartE2EDuration="3.577031606s" podCreationTimestamp="2025-11-25 15:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:57.563521288 +0000 UTC m=+1302.531115034" watchObservedRunningTime="2025-11-25 15:25:57.577031606 +0000 UTC m=+1302.544625352" Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.586378 4965 generic.go:334] "Generic (PLEG): container finished" podID="2d1929ee-4d39-46a6-bdd5-684137c6844e" containerID="0fc399e328560f523fcaecb5d11f16782fa8cc9b9cab42549f9b13e1506b6063" exitCode=0 Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.586672 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" event={"ID":"2d1929ee-4d39-46a6-bdd5-684137c6844e","Type":"ContainerDied","Data":"0fc399e328560f523fcaecb5d11f16782fa8cc9b9cab42549f9b13e1506b6063"} Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.596765 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wbvvf" podStartSLOduration=2.596742602 podStartE2EDuration="2.596742602s" podCreationTimestamp="2025-11-25 15:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:57.58710212 +0000 UTC m=+1302.554695866" watchObservedRunningTime="2025-11-25 15:25:57.596742602 +0000 UTC m=+1302.564336348" Nov 25 15:25:57 crc kubenswrapper[4965]: I1125 15:25:57.600536 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gpzgb" event={"ID":"ff456420-44c5-4945-925c-2f36ae44aad3","Type":"ContainerStarted","Data":"dbbd4bd8b2c5fbdf4c52b2018f31bda79cc7291b9c59513c406d545b25802581"} Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.114340 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.229177 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnrsn\" (UniqueName: \"kubernetes.io/projected/2d1929ee-4d39-46a6-bdd5-684137c6844e-kube-api-access-pnrsn\") pod \"2d1929ee-4d39-46a6-bdd5-684137c6844e\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.229270 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-sb\") pod \"2d1929ee-4d39-46a6-bdd5-684137c6844e\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.229307 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-config\") pod \"2d1929ee-4d39-46a6-bdd5-684137c6844e\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.229478 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-nb\") pod \"2d1929ee-4d39-46a6-bdd5-684137c6844e\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.232144 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-dns-svc\") pod \"2d1929ee-4d39-46a6-bdd5-684137c6844e\" (UID: \"2d1929ee-4d39-46a6-bdd5-684137c6844e\") " Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.235981 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2d1929ee-4d39-46a6-bdd5-684137c6844e-kube-api-access-pnrsn" (OuterVolumeSpecName: "kube-api-access-pnrsn") pod "2d1929ee-4d39-46a6-bdd5-684137c6844e" (UID: "2d1929ee-4d39-46a6-bdd5-684137c6844e"). InnerVolumeSpecName "kube-api-access-pnrsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.299620 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-config" (OuterVolumeSpecName: "config") pod "2d1929ee-4d39-46a6-bdd5-684137c6844e" (UID: "2d1929ee-4d39-46a6-bdd5-684137c6844e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.300172 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d1929ee-4d39-46a6-bdd5-684137c6844e" (UID: "2d1929ee-4d39-46a6-bdd5-684137c6844e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.300409 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d1929ee-4d39-46a6-bdd5-684137c6844e" (UID: "2d1929ee-4d39-46a6-bdd5-684137c6844e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.335180 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.335217 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnrsn\" (UniqueName: \"kubernetes.io/projected/2d1929ee-4d39-46a6-bdd5-684137c6844e-kube-api-access-pnrsn\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.335230 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.335238 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.477120 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d1929ee-4d39-46a6-bdd5-684137c6844e" (UID: "2d1929ee-4d39-46a6-bdd5-684137c6844e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.538908 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d1929ee-4d39-46a6-bdd5-684137c6844e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.652150 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" event={"ID":"2d1929ee-4d39-46a6-bdd5-684137c6844e","Type":"ContainerDied","Data":"13e11a4aa4d4618d2886afcf9374a6c3388d81d3e2f10d388d185447b705a760"} Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.652530 4965 scope.go:117] "RemoveContainer" containerID="0fc399e328560f523fcaecb5d11f16782fa8cc9b9cab42549f9b13e1506b6063" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.652719 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-5sdll" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.667280 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" event={"ID":"a2582ae0-ed83-440f-a1e0-2e9e65d9a005","Type":"ContainerStarted","Data":"1cc687b285ecf780f7cff178a9ca5f5aebe6d6286b6e609dd51939879af1c139"} Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.697521 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" podStartSLOduration=3.697493383 podStartE2EDuration="3.697493383s" podCreationTimestamp="2025-11-25 15:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:25:58.697287898 +0000 UTC m=+1303.664881634" watchObservedRunningTime="2025-11-25 15:25:58.697493383 +0000 UTC m=+1303.665087139" Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.788050 4965 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-5sdll"] Nov 25 15:25:58 crc kubenswrapper[4965]: I1125 15:25:58.827998 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-5sdll"] Nov 25 15:25:59 crc kubenswrapper[4965]: I1125 15:25:59.253208 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:25:59 crc kubenswrapper[4965]: I1125 15:25:59.701028 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:26:00 crc kubenswrapper[4965]: I1125 15:26:00.786577 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1929ee-4d39-46a6-bdd5-684137c6844e" path="/var/lib/kubelet/pods/2d1929ee-4d39-46a6-bdd5-684137c6844e/volumes" Nov 25 15:26:03 crc kubenswrapper[4965]: I1125 15:26:03.748091 4965 generic.go:334] "Generic (PLEG): container finished" podID="cb05fd28-1065-4beb-af6a-a24b1848f2b3" containerID="5d94320ba1d71d2e918a3a347c3890dba983d183d197427156eb60aa5b159774" exitCode=0 Nov 25 15:26:03 crc kubenswrapper[4965]: I1125 15:26:03.748164 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2x65x" event={"ID":"cb05fd28-1065-4beb-af6a-a24b1848f2b3","Type":"ContainerDied","Data":"5d94320ba1d71d2e918a3a347c3890dba983d183d197427156eb60aa5b159774"} Nov 25 15:26:05 crc kubenswrapper[4965]: I1125 15:26:05.724396 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:26:05 crc kubenswrapper[4965]: I1125 15:26:05.793457 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bn4gq"] Nov 25 15:26:05 crc kubenswrapper[4965]: I1125 15:26:05.793729 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" 
containerID="cri-o://de9a870f43d29aee7d0caf5f7191254e3b7ea5f9b5cb6402c1f1aeec9baf67ef" gracePeriod=10 Nov 25 15:26:06 crc kubenswrapper[4965]: I1125 15:26:06.808931 4965 generic.go:334] "Generic (PLEG): container finished" podID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerID="de9a870f43d29aee7d0caf5f7191254e3b7ea5f9b5cb6402c1f1aeec9baf67ef" exitCode=0 Nov 25 15:26:06 crc kubenswrapper[4965]: I1125 15:26:06.809014 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" event={"ID":"af526b85-9598-43f5-90d5-bdc6f1a0eb1f","Type":"ContainerDied","Data":"de9a870f43d29aee7d0caf5f7191254e3b7ea5f9b5cb6402c1f1aeec9baf67ef"} Nov 25 15:26:08 crc kubenswrapper[4965]: I1125 15:26:08.829080 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Nov 25 15:26:17 crc kubenswrapper[4965]: E1125 15:26:17.320646 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Nov 25 15:26:17 crc kubenswrapper[4965]: E1125 15:26:17.321511 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ll9bz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-cr787_openstack(0546399d-f1ee-4fe8-aa16-fb64e9f58899): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Nov 25 15:26:17 crc kubenswrapper[4965]: E1125 15:26:17.324124 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-cr787" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.381437 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.436505 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-config-data\") pod \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.436574 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-fernet-keys\") pod \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.436655 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-combined-ca-bundle\") pod \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.436675 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-credential-keys\") pod \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " Nov 
25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.436807 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-scripts\") pod \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.436841 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2b82\" (UniqueName: \"kubernetes.io/projected/cb05fd28-1065-4beb-af6a-a24b1848f2b3-kube-api-access-j2b82\") pod \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\" (UID: \"cb05fd28-1065-4beb-af6a-a24b1848f2b3\") " Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.448215 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cb05fd28-1065-4beb-af6a-a24b1848f2b3" (UID: "cb05fd28-1065-4beb-af6a-a24b1848f2b3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.451384 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb05fd28-1065-4beb-af6a-a24b1848f2b3-kube-api-access-j2b82" (OuterVolumeSpecName: "kube-api-access-j2b82") pod "cb05fd28-1065-4beb-af6a-a24b1848f2b3" (UID: "cb05fd28-1065-4beb-af6a-a24b1848f2b3"). InnerVolumeSpecName "kube-api-access-j2b82". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.454270 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-scripts" (OuterVolumeSpecName: "scripts") pod "cb05fd28-1065-4beb-af6a-a24b1848f2b3" (UID: "cb05fd28-1065-4beb-af6a-a24b1848f2b3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.454727 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cb05fd28-1065-4beb-af6a-a24b1848f2b3" (UID: "cb05fd28-1065-4beb-af6a-a24b1848f2b3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.468946 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-config-data" (OuterVolumeSpecName: "config-data") pod "cb05fd28-1065-4beb-af6a-a24b1848f2b3" (UID: "cb05fd28-1065-4beb-af6a-a24b1848f2b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.473196 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb05fd28-1065-4beb-af6a-a24b1848f2b3" (UID: "cb05fd28-1065-4beb-af6a-a24b1848f2b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.538639 4965 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.538672 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.538689 4965 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.538703 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.538713 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2b82\" (UniqueName: \"kubernetes.io/projected/cb05fd28-1065-4beb-af6a-a24b1848f2b3-kube-api-access-j2b82\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.538725 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb05fd28-1065-4beb-af6a-a24b1848f2b3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.908242 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2x65x" event={"ID":"cb05fd28-1065-4beb-af6a-a24b1848f2b3","Type":"ContainerDied","Data":"5d10637f2ff2c872a60d85dd60e4154d956d9ff07bee57afd2e796c84e4d9a60"} Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 
15:26:17.908546 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d10637f2ff2c872a60d85dd60e4154d956d9ff07bee57afd2e796c84e4d9a60" Nov 25 15:26:17 crc kubenswrapper[4965]: I1125 15:26:17.908282 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2x65x" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.473477 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2x65x"] Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.480574 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2x65x"] Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.580817 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sj59d"] Nov 25 15:26:18 crc kubenswrapper[4965]: E1125 15:26:18.581304 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1929ee-4d39-46a6-bdd5-684137c6844e" containerName="init" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.581321 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1929ee-4d39-46a6-bdd5-684137c6844e" containerName="init" Nov 25 15:26:18 crc kubenswrapper[4965]: E1125 15:26:18.581338 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb05fd28-1065-4beb-af6a-a24b1848f2b3" containerName="keystone-bootstrap" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.581346 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb05fd28-1065-4beb-af6a-a24b1848f2b3" containerName="keystone-bootstrap" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.581607 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb05fd28-1065-4beb-af6a-a24b1848f2b3" containerName="keystone-bootstrap" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.581633 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1929ee-4d39-46a6-bdd5-684137c6844e" 
containerName="init" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.582338 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.584061 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.584139 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v47cv" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.584483 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.585260 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.585655 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.591366 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sj59d"] Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.658845 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-fernet-keys\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.658897 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dkh6\" (UniqueName: \"kubernetes.io/projected/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-kube-api-access-5dkh6\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc 
kubenswrapper[4965]: I1125 15:26:18.658960 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-config-data\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.659117 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-combined-ca-bundle\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.659170 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-scripts\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.659288 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-credential-keys\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.760495 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-scripts\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.760558 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-credential-keys\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.760597 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-fernet-keys\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.760622 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkh6\" (UniqueName: \"kubernetes.io/projected/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-kube-api-access-5dkh6\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.760665 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-config-data\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.760718 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-combined-ca-bundle\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.766321 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-combined-ca-bundle\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.767411 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-config-data\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.769852 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-credential-keys\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.769946 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-scripts\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.771936 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-fernet-keys\") pod \"keystone-bootstrap-sj59d\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.778635 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dkh6\" (UniqueName: \"kubernetes.io/projected/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-kube-api-access-5dkh6\") pod \"keystone-bootstrap-sj59d\" (UID: 
\"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.784342 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb05fd28-1065-4beb-af6a-a24b1848f2b3" path="/var/lib/kubelet/pods/cb05fd28-1065-4beb-af6a-a24b1848f2b3/volumes" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.829299 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Nov 25 15:26:18 crc kubenswrapper[4965]: I1125 15:26:18.915343 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:19 crc kubenswrapper[4965]: I1125 15:26:19.927200 4965 generic.go:334] "Generic (PLEG): container finished" podID="c9d6c94f-3d83-4521-ac18-3eda81279450" containerID="d7a5f784131138dab7cfff054577351056b9ac1c10c088ec78d4673b33c01d4a" exitCode=0 Nov 25 15:26:19 crc kubenswrapper[4965]: I1125 15:26:19.927291 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wbvvf" event={"ID":"c9d6c94f-3d83-4521-ac18-3eda81279450","Type":"ContainerDied","Data":"d7a5f784131138dab7cfff054577351056b9ac1c10c088ec78d4673b33c01d4a"} Nov 25 15:26:23 crc kubenswrapper[4965]: I1125 15:26:23.260475 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:26:23 crc kubenswrapper[4965]: I1125 15:26:23.260937 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:26:23 crc kubenswrapper[4965]: I1125 15:26:23.830557 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Nov 25 15:26:23 crc kubenswrapper[4965]: I1125 15:26:23.831171 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:26:28 crc kubenswrapper[4965]: I1125 15:26:28.830795 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Nov 25 15:26:29 crc kubenswrapper[4965]: E1125 15:26:29.776344 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-cr787" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" Nov 25 15:26:30 crc kubenswrapper[4965]: I1125 15:26:30.997413 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.005776 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.049793 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wbvvf" event={"ID":"c9d6c94f-3d83-4521-ac18-3eda81279450","Type":"ContainerDied","Data":"9779204ed5bb453142d2cd5f8f8a61eb5ee52a68b454325e10fd24078365e0d5"} Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.050079 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9779204ed5bb453142d2cd5f8f8a61eb5ee52a68b454325e10fd24078365e0d5" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.050298 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wbvvf" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.063110 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" event={"ID":"af526b85-9598-43f5-90d5-bdc6f1a0eb1f","Type":"ContainerDied","Data":"2da553979127205917b4d62ac9a5b200e68a20e616c820e4c65ce375f6fd4e55"} Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.063177 4965 scope.go:117] "RemoveContainer" containerID="de9a870f43d29aee7d0caf5f7191254e3b7ea5f9b5cb6402c1f1aeec9baf67ef" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.063394 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.127137 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs7qz\" (UniqueName: \"kubernetes.io/projected/c9d6c94f-3d83-4521-ac18-3eda81279450-kube-api-access-cs7qz\") pod \"c9d6c94f-3d83-4521-ac18-3eda81279450\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128054 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-config\") pod \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128127 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-sb\") pod \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128149 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7bz7\" (UniqueName: \"kubernetes.io/projected/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-kube-api-access-z7bz7\") pod \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128194 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-dns-svc\") pod \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128251 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-config\") pod \"c9d6c94f-3d83-4521-ac18-3eda81279450\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128293 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-nb\") pod \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\" (UID: \"af526b85-9598-43f5-90d5-bdc6f1a0eb1f\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.128337 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-combined-ca-bundle\") pod \"c9d6c94f-3d83-4521-ac18-3eda81279450\" (UID: \"c9d6c94f-3d83-4521-ac18-3eda81279450\") " Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.133281 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d6c94f-3d83-4521-ac18-3eda81279450-kube-api-access-cs7qz" (OuterVolumeSpecName: "kube-api-access-cs7qz") pod "c9d6c94f-3d83-4521-ac18-3eda81279450" (UID: "c9d6c94f-3d83-4521-ac18-3eda81279450"). InnerVolumeSpecName "kube-api-access-cs7qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.133350 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-kube-api-access-z7bz7" (OuterVolumeSpecName: "kube-api-access-z7bz7") pod "af526b85-9598-43f5-90d5-bdc6f1a0eb1f" (UID: "af526b85-9598-43f5-90d5-bdc6f1a0eb1f"). InnerVolumeSpecName "kube-api-access-z7bz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.161073 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-config" (OuterVolumeSpecName: "config") pod "c9d6c94f-3d83-4521-ac18-3eda81279450" (UID: "c9d6c94f-3d83-4521-ac18-3eda81279450"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.161666 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9d6c94f-3d83-4521-ac18-3eda81279450" (UID: "c9d6c94f-3d83-4521-ac18-3eda81279450"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.174173 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af526b85-9598-43f5-90d5-bdc6f1a0eb1f" (UID: "af526b85-9598-43f5-90d5-bdc6f1a0eb1f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.185387 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af526b85-9598-43f5-90d5-bdc6f1a0eb1f" (UID: "af526b85-9598-43f5-90d5-bdc6f1a0eb1f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.188929 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-config" (OuterVolumeSpecName: "config") pod "af526b85-9598-43f5-90d5-bdc6f1a0eb1f" (UID: "af526b85-9598-43f5-90d5-bdc6f1a0eb1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.198329 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af526b85-9598-43f5-90d5-bdc6f1a0eb1f" (UID: "af526b85-9598-43f5-90d5-bdc6f1a0eb1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230385 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs7qz\" (UniqueName: \"kubernetes.io/projected/c9d6c94f-3d83-4521-ac18-3eda81279450-kube-api-access-cs7qz\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230431 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230444 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230456 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7bz7\" (UniqueName: \"kubernetes.io/projected/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-kube-api-access-z7bz7\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc 
kubenswrapper[4965]: I1125 15:26:31.230468 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230480 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230491 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af526b85-9598-43f5-90d5-bdc6f1a0eb1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.230502 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d6c94f-3d83-4521-ac18-3eda81279450-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.397830 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bn4gq"] Nov 25 15:26:31 crc kubenswrapper[4965]: I1125 15:26:31.404643 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bn4gq"] Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.130519 4965 scope.go:117] "RemoveContainer" containerID="c42488b332d0384f990165aadd0b22d646ddf4a5c4b5bf6584158680c709e42b" Nov 25 15:26:32 crc kubenswrapper[4965]: E1125 15:26:32.153565 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 25 15:26:32 crc kubenswrapper[4965]: E1125 15:26:32.153817 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kdc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lmtqd_openstack(78768f1b-d9d5-4124-8fe5-bc4b357605ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 15:26:32 crc kubenswrapper[4965]: E1125 15:26:32.155066 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lmtqd" podUID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.341504 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb44dcd7c-j7d8l"] Nov 25 15:26:32 crc kubenswrapper[4965]: E1125 15:26:32.342192 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="init" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.342232 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="init" Nov 25 15:26:32 crc kubenswrapper[4965]: E1125 15:26:32.342256 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d6c94f-3d83-4521-ac18-3eda81279450" containerName="neutron-db-sync" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.342263 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d6c94f-3d83-4521-ac18-3eda81279450" containerName="neutron-db-sync" Nov 25 15:26:32 crc kubenswrapper[4965]: E1125 15:26:32.342276 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.342282 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.342570 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.342591 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d6c94f-3d83-4521-ac18-3eda81279450" containerName="neutron-db-sync" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.352979 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.370766 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb44dcd7c-j7d8l"] Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.455238 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-config\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.455294 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-nb\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.455363 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-sb\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.455396 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgprz\" (UniqueName: \"kubernetes.io/projected/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-kube-api-access-jgprz\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.455487 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-dns-svc\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.540345 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5584c797bd-g5v4b"] Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.542138 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.545858 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.546788 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.546907 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zghfs" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.547330 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.555960 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5584c797bd-g5v4b"] Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.557253 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-dns-svc\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.557489 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-config\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.557576 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-nb\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " 
pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.557672 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-sb\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.557756 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgprz\" (UniqueName: \"kubernetes.io/projected/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-kube-api-access-jgprz\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.558400 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-dns-svc\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.558754 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-nb\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.558990 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-sb\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: 
I1125 15:26:32.559058 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-config\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.596917 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgprz\" (UniqueName: \"kubernetes.io/projected/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-kube-api-access-jgprz\") pod \"dnsmasq-dns-bb44dcd7c-j7d8l\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.659567 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2fl\" (UniqueName: \"kubernetes.io/projected/67c635f8-b3c2-49fe-b8f3-110550f9e86d-kube-api-access-2m2fl\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.659707 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-ovndb-tls-certs\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.659746 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-config\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.659776 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-combined-ca-bundle\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.659814 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-httpd-config\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.761493 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2fl\" (UniqueName: \"kubernetes.io/projected/67c635f8-b3c2-49fe-b8f3-110550f9e86d-kube-api-access-2m2fl\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.761653 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-ovndb-tls-certs\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.761696 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-config\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.761743 4965 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-combined-ca-bundle\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.761794 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-httpd-config\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.767861 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-config\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.767876 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-ovndb-tls-certs\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.774879 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-httpd-config\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.775679 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-combined-ca-bundle\") pod 
\"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.781250 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" path="/var/lib/kubelet/pods/af526b85-9598-43f5-90d5-bdc6f1a0eb1f/volumes" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.782637 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.786419 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2fl\" (UniqueName: \"kubernetes.io/projected/67c635f8-b3c2-49fe-b8f3-110550f9e86d-kube-api-access-2m2fl\") pod \"neutron-5584c797bd-g5v4b\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.829011 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sj59d"] Nov 25 15:26:32 crc kubenswrapper[4965]: I1125 15:26:32.863865 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.136771 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerStarted","Data":"d549e575ad2ce1815025007924d0f8471e010b51e11aa5da478797b2f70db9f5"} Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.157714 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sj59d" event={"ID":"3aac4c8e-9a2b-4268-b9c9-c2920b585f64","Type":"ContainerStarted","Data":"b8e3551fd346e2a391d92ef38d4799350aa5efcc2cd58d2fc6285cef38ebf8e0"} Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.157751 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sj59d" event={"ID":"3aac4c8e-9a2b-4268-b9c9-c2920b585f64","Type":"ContainerStarted","Data":"68945cf5110338c70acf5fc1ca6100bf355e0dee41d02fb9aeba71a4e7195e48"} Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.168452 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gpzgb" event={"ID":"ff456420-44c5-4945-925c-2f36ae44aad3","Type":"ContainerStarted","Data":"a7c8975209991a5252c93faab063aef4580adffd72d12bc295805db554186c6d"} Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.186932 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rg5sn" event={"ID":"00430c42-5b8c-45e7-97d3-c4c256468678","Type":"ContainerStarted","Data":"e53e124612fc63f48237b9a3c25ef382c79922eea0f8f23723f10bfddfd2698d"} Nov 25 15:26:33 crc kubenswrapper[4965]: E1125 15:26:33.189068 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lmtqd" podUID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" Nov 
25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.209197 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gpzgb" podStartSLOduration=2.7012039249999997 podStartE2EDuration="38.209177501s" podCreationTimestamp="2025-11-25 15:25:55 +0000 UTC" firstStartedPulling="2025-11-25 15:25:56.627641723 +0000 UTC m=+1301.595235469" lastFinishedPulling="2025-11-25 15:26:32.135615299 +0000 UTC m=+1337.103209045" observedRunningTime="2025-11-25 15:26:33.208040751 +0000 UTC m=+1338.175634497" watchObservedRunningTime="2025-11-25 15:26:33.209177501 +0000 UTC m=+1338.176771247" Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.213748 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sj59d" podStartSLOduration=15.213736746 podStartE2EDuration="15.213736746s" podCreationTimestamp="2025-11-25 15:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:33.184666175 +0000 UTC m=+1338.152259921" watchObservedRunningTime="2025-11-25 15:26:33.213736746 +0000 UTC m=+1338.181330492" Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.246081 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rg5sn" podStartSLOduration=2.8604885490000003 podStartE2EDuration="38.246063626s" podCreationTimestamp="2025-11-25 15:25:55 +0000 UTC" firstStartedPulling="2025-11-25 15:25:56.745471939 +0000 UTC m=+1301.713065685" lastFinishedPulling="2025-11-25 15:26:32.131047006 +0000 UTC m=+1337.098640762" observedRunningTime="2025-11-25 15:26:33.2414733 +0000 UTC m=+1338.209067046" watchObservedRunningTime="2025-11-25 15:26:33.246063626 +0000 UTC m=+1338.213657372" Nov 25 15:26:33 crc kubenswrapper[4965]: I1125 15:26:33.368495 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb44dcd7c-j7d8l"] Nov 25 15:26:33 
crc kubenswrapper[4965]: I1125 15:26:33.832507 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-bn4gq" podUID="af526b85-9598-43f5-90d5-bdc6f1a0eb1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Nov 25 15:26:34 crc kubenswrapper[4965]: I1125 15:26:34.209198 4965 generic.go:334] "Generic (PLEG): container finished" podID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerID="4eb91395aa46b41e81ae1bb77e9df4cf9202a68ef90745acfa8a62c39904043a" exitCode=0 Nov 25 15:26:34 crc kubenswrapper[4965]: I1125 15:26:34.210692 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" event={"ID":"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1","Type":"ContainerDied","Data":"4eb91395aa46b41e81ae1bb77e9df4cf9202a68ef90745acfa8a62c39904043a"} Nov 25 15:26:34 crc kubenswrapper[4965]: I1125 15:26:34.210716 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" event={"ID":"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1","Type":"ContainerStarted","Data":"4e03e0b982417b3e10599063bf30074f42148e627854dc5bb6c50aab4b5d621b"} Nov 25 15:26:34 crc kubenswrapper[4965]: I1125 15:26:34.327295 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5584c797bd-g5v4b"] Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.239672 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerStarted","Data":"c51e4db9685e7a431625e15a208bab8d5ee0f4920ccd03e2ff6584c25927f159"} Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.244580 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5584c797bd-g5v4b" event={"ID":"67c635f8-b3c2-49fe-b8f3-110550f9e86d","Type":"ContainerStarted","Data":"82d21146ff024b4028113639af6015b56a4ee884b1ded69f72a8701ac83bf0c5"} Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 
15:26:35.244623 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5584c797bd-g5v4b" event={"ID":"67c635f8-b3c2-49fe-b8f3-110550f9e86d","Type":"ContainerStarted","Data":"c19dc1290439cf0df01092ace1c49007a3439744671a1aa8bece360a66fe2264"} Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.248234 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" event={"ID":"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1","Type":"ContainerStarted","Data":"9a0644deafab5ad319904ce6375da97734e9a6042a5e95858e563bc2a0a66275"} Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.249792 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.273030 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-754f5b77b5-wngzm"] Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.274684 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.278591 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.288565 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.298220 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" podStartSLOduration=3.2982008130000002 podStartE2EDuration="3.298200813s" podCreationTimestamp="2025-11-25 15:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:35.288551111 +0000 UTC m=+1340.256144857" watchObservedRunningTime="2025-11-25 15:26:35.298200813 +0000 UTC m=+1340.265794559" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.304672 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754f5b77b5-wngzm"] Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.359923 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4fs\" (UniqueName: \"kubernetes.io/projected/b65d1464-decb-4a38-8d9c-863605da10e1-kube-api-access-2f4fs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.360199 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-combined-ca-bundle\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: 
I1125 15:26:35.360305 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-internal-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.360400 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-config\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.360481 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-ovndb-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.360575 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-httpd-config\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.360648 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-public-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 
15:26:35.461942 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4fs\" (UniqueName: \"kubernetes.io/projected/b65d1464-decb-4a38-8d9c-863605da10e1-kube-api-access-2f4fs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.462028 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-combined-ca-bundle\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.462053 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-internal-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.462079 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-config\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.462103 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-ovndb-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.462145 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-httpd-config\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.462179 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-public-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.471208 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-public-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.471217 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-combined-ca-bundle\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.472201 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-config\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.472803 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-internal-tls-certs\") pod 
\"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.472808 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-ovndb-tls-certs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.473529 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b65d1464-decb-4a38-8d9c-863605da10e1-httpd-config\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.488452 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4fs\" (UniqueName: \"kubernetes.io/projected/b65d1464-decb-4a38-8d9c-863605da10e1-kube-api-access-2f4fs\") pod \"neutron-754f5b77b5-wngzm\" (UID: \"b65d1464-decb-4a38-8d9c-863605da10e1\") " pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:35 crc kubenswrapper[4965]: I1125 15:26:35.601180 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:36 crc kubenswrapper[4965]: I1125 15:26:36.155445 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754f5b77b5-wngzm"] Nov 25 15:26:36 crc kubenswrapper[4965]: W1125 15:26:36.168932 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb65d1464_decb_4a38_8d9c_863605da10e1.slice/crio-b8b6beb3add789f67b5525fa5190fe9fd5000a97218baa053e50d01496ea5f00 WatchSource:0}: Error finding container b8b6beb3add789f67b5525fa5190fe9fd5000a97218baa053e50d01496ea5f00: Status 404 returned error can't find the container with id b8b6beb3add789f67b5525fa5190fe9fd5000a97218baa053e50d01496ea5f00 Nov 25 15:26:36 crc kubenswrapper[4965]: I1125 15:26:36.263756 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754f5b77b5-wngzm" event={"ID":"b65d1464-decb-4a38-8d9c-863605da10e1","Type":"ContainerStarted","Data":"b8b6beb3add789f67b5525fa5190fe9fd5000a97218baa053e50d01496ea5f00"} Nov 25 15:26:36 crc kubenswrapper[4965]: I1125 15:26:36.268200 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5584c797bd-g5v4b" event={"ID":"67c635f8-b3c2-49fe-b8f3-110550f9e86d","Type":"ContainerStarted","Data":"24f9340243084c7d5873a3f452bdbb9edea5f40fb75191930c2473bdee6e0a84"} Nov 25 15:26:36 crc kubenswrapper[4965]: I1125 15:26:36.304469 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5584c797bd-g5v4b" podStartSLOduration=4.304440822 podStartE2EDuration="4.304440822s" podCreationTimestamp="2025-11-25 15:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:36.29183921 +0000 UTC m=+1341.259432956" watchObservedRunningTime="2025-11-25 15:26:36.304440822 +0000 UTC m=+1341.272034568" Nov 25 15:26:37 crc kubenswrapper[4965]: 
I1125 15:26:37.283033 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754f5b77b5-wngzm" event={"ID":"b65d1464-decb-4a38-8d9c-863605da10e1","Type":"ContainerStarted","Data":"a4b2cb84a82884aba92b2a616c82459e7ae4885498ad60728c2aaad459646886"} Nov 25 15:26:37 crc kubenswrapper[4965]: I1125 15:26:37.283416 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:26:37 crc kubenswrapper[4965]: I1125 15:26:37.283456 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754f5b77b5-wngzm" event={"ID":"b65d1464-decb-4a38-8d9c-863605da10e1","Type":"ContainerStarted","Data":"e1ae66c33e93e5aa27338b6e42f721b5d487c1f10c817b7fefaf01ab5258e7d8"} Nov 25 15:26:37 crc kubenswrapper[4965]: I1125 15:26:37.283554 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:26:37 crc kubenswrapper[4965]: I1125 15:26:37.308311 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-754f5b77b5-wngzm" podStartSLOduration=2.308283817 podStartE2EDuration="2.308283817s" podCreationTimestamp="2025-11-25 15:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:37.305881631 +0000 UTC m=+1342.273475377" watchObservedRunningTime="2025-11-25 15:26:37.308283817 +0000 UTC m=+1342.275877563" Nov 25 15:26:42 crc kubenswrapper[4965]: I1125 15:26:42.784150 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:26:42 crc kubenswrapper[4965]: I1125 15:26:42.862270 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xfd5f"] Nov 25 15:26:42 crc kubenswrapper[4965]: I1125 15:26:42.862565 4965 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerName="dnsmasq-dns" containerID="cri-o://1cc687b285ecf780f7cff178a9ca5f5aebe6d6286b6e609dd51939879af1c139" gracePeriod=10 Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.344250 4965 generic.go:334] "Generic (PLEG): container finished" podID="ff456420-44c5-4945-925c-2f36ae44aad3" containerID="a7c8975209991a5252c93faab063aef4580adffd72d12bc295805db554186c6d" exitCode=0 Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.344950 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gpzgb" event={"ID":"ff456420-44c5-4945-925c-2f36ae44aad3","Type":"ContainerDied","Data":"a7c8975209991a5252c93faab063aef4580adffd72d12bc295805db554186c6d"} Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.353000 4965 generic.go:334] "Generic (PLEG): container finished" podID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerID="1cc687b285ecf780f7cff178a9ca5f5aebe6d6286b6e609dd51939879af1c139" exitCode=0 Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.353252 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" event={"ID":"a2582ae0-ed83-440f-a1e0-2e9e65d9a005","Type":"ContainerDied","Data":"1cc687b285ecf780f7cff178a9ca5f5aebe6d6286b6e609dd51939879af1c139"} Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.368283 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerStarted","Data":"86f6e2e46fc72758cfe5c6cf8334ddfece914ce3a9df52a023fe0e2f72460281"} Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.454714 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.581282 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-dns-svc\") pod \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.581446 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddcf\" (UniqueName: \"kubernetes.io/projected/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-kube-api-access-xddcf\") pod \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.581483 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-config\") pod \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.581539 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-sb\") pod \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.581620 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-nb\") pod \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\" (UID: \"a2582ae0-ed83-440f-a1e0-2e9e65d9a005\") " Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.599705 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-kube-api-access-xddcf" (OuterVolumeSpecName: "kube-api-access-xddcf") pod "a2582ae0-ed83-440f-a1e0-2e9e65d9a005" (UID: "a2582ae0-ed83-440f-a1e0-2e9e65d9a005"). InnerVolumeSpecName "kube-api-access-xddcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.637698 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2582ae0-ed83-440f-a1e0-2e9e65d9a005" (UID: "a2582ae0-ed83-440f-a1e0-2e9e65d9a005"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.647475 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-config" (OuterVolumeSpecName: "config") pod "a2582ae0-ed83-440f-a1e0-2e9e65d9a005" (UID: "a2582ae0-ed83-440f-a1e0-2e9e65d9a005"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.649940 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2582ae0-ed83-440f-a1e0-2e9e65d9a005" (UID: "a2582ae0-ed83-440f-a1e0-2e9e65d9a005"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.660814 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2582ae0-ed83-440f-a1e0-2e9e65d9a005" (UID: "a2582ae0-ed83-440f-a1e0-2e9e65d9a005"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.683983 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.684197 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.684282 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.684343 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xddcf\" (UniqueName: \"kubernetes.io/projected/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-kube-api-access-xddcf\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:43 crc kubenswrapper[4965]: I1125 15:26:43.684395 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2582ae0-ed83-440f-a1e0-2e9e65d9a005-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.380549 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" event={"ID":"a2582ae0-ed83-440f-a1e0-2e9e65d9a005","Type":"ContainerDied","Data":"656a2c8543d7606ed356ab139c371a90f6960680e2b5f43b4de655adc3d426c9"} Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.380600 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xfd5f" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.380774 4965 scope.go:117] "RemoveContainer" containerID="1cc687b285ecf780f7cff178a9ca5f5aebe6d6286b6e609dd51939879af1c139" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.383559 4965 generic.go:334] "Generic (PLEG): container finished" podID="00430c42-5b8c-45e7-97d3-c4c256468678" containerID="e53e124612fc63f48237b9a3c25ef382c79922eea0f8f23723f10bfddfd2698d" exitCode=0 Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.383762 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rg5sn" event={"ID":"00430c42-5b8c-45e7-97d3-c4c256468678","Type":"ContainerDied","Data":"e53e124612fc63f48237b9a3c25ef382c79922eea0f8f23723f10bfddfd2698d"} Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.452208 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xfd5f"] Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.452275 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xfd5f"] Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.481180 4965 scope.go:117] "RemoveContainer" containerID="d25930367d8c8f1b1a77995eb25797cb8e5443026ddbf15ecc5abcc611111475" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.784208 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" path="/var/lib/kubelet/pods/a2582ae0-ed83-440f-a1e0-2e9e65d9a005/volumes" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.823427 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gpzgb" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.906785 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-config-data\") pod \"ff456420-44c5-4945-925c-2f36ae44aad3\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.906860 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx4dh\" (UniqueName: \"kubernetes.io/projected/ff456420-44c5-4945-925c-2f36ae44aad3-kube-api-access-jx4dh\") pod \"ff456420-44c5-4945-925c-2f36ae44aad3\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.906937 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-scripts\") pod \"ff456420-44c5-4945-925c-2f36ae44aad3\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.907008 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-combined-ca-bundle\") pod \"ff456420-44c5-4945-925c-2f36ae44aad3\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.907099 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff456420-44c5-4945-925c-2f36ae44aad3-logs\") pod \"ff456420-44c5-4945-925c-2f36ae44aad3\" (UID: \"ff456420-44c5-4945-925c-2f36ae44aad3\") " Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.908773 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ff456420-44c5-4945-925c-2f36ae44aad3-logs" (OuterVolumeSpecName: "logs") pod "ff456420-44c5-4945-925c-2f36ae44aad3" (UID: "ff456420-44c5-4945-925c-2f36ae44aad3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.912443 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff456420-44c5-4945-925c-2f36ae44aad3-kube-api-access-jx4dh" (OuterVolumeSpecName: "kube-api-access-jx4dh") pod "ff456420-44c5-4945-925c-2f36ae44aad3" (UID: "ff456420-44c5-4945-925c-2f36ae44aad3"). InnerVolumeSpecName "kube-api-access-jx4dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.913577 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-scripts" (OuterVolumeSpecName: "scripts") pod "ff456420-44c5-4945-925c-2f36ae44aad3" (UID: "ff456420-44c5-4945-925c-2f36ae44aad3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.937283 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff456420-44c5-4945-925c-2f36ae44aad3" (UID: "ff456420-44c5-4945-925c-2f36ae44aad3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:44 crc kubenswrapper[4965]: I1125 15:26:44.937996 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-config-data" (OuterVolumeSpecName: "config-data") pod "ff456420-44c5-4945-925c-2f36ae44aad3" (UID: "ff456420-44c5-4945-925c-2f36ae44aad3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.010225 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.010263 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx4dh\" (UniqueName: \"kubernetes.io/projected/ff456420-44c5-4945-925c-2f36ae44aad3-kube-api-access-jx4dh\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.010273 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.010281 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff456420-44c5-4945-925c-2f36ae44aad3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.010290 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff456420-44c5-4945-925c-2f36ae44aad3-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.402553 4965 generic.go:334] "Generic (PLEG): container finished" podID="3aac4c8e-9a2b-4268-b9c9-c2920b585f64" containerID="b8e3551fd346e2a391d92ef38d4799350aa5efcc2cd58d2fc6285cef38ebf8e0" exitCode=0 Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.402639 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sj59d" event={"ID":"3aac4c8e-9a2b-4268-b9c9-c2920b585f64","Type":"ContainerDied","Data":"b8e3551fd346e2a391d92ef38d4799350aa5efcc2cd58d2fc6285cef38ebf8e0"} Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.418549 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gpzgb" event={"ID":"ff456420-44c5-4945-925c-2f36ae44aad3","Type":"ContainerDied","Data":"dbbd4bd8b2c5fbdf4c52b2018f31bda79cc7291b9c59513c406d545b25802581"} Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.418641 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbbd4bd8b2c5fbdf4c52b2018f31bda79cc7291b9c59513c406d545b25802581" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.418761 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gpzgb" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.502214 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55bb9cdb94-946lv"] Nov 25 15:26:45 crc kubenswrapper[4965]: E1125 15:26:45.503322 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerName="init" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.503345 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerName="init" Nov 25 15:26:45 crc kubenswrapper[4965]: E1125 15:26:45.503367 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerName="dnsmasq-dns" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.503373 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerName="dnsmasq-dns" Nov 25 15:26:45 crc kubenswrapper[4965]: E1125 15:26:45.503381 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff456420-44c5-4945-925c-2f36ae44aad3" containerName="placement-db-sync" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.503386 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff456420-44c5-4945-925c-2f36ae44aad3" containerName="placement-db-sync" Nov 25 15:26:45 crc 
kubenswrapper[4965]: I1125 15:26:45.503541 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2582ae0-ed83-440f-a1e0-2e9e65d9a005" containerName="dnsmasq-dns" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.503556 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff456420-44c5-4945-925c-2f36ae44aad3" containerName="placement-db-sync" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.507094 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.517630 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.517747 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.518070 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.524608 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tj7pv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.533696 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.537163 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55bb9cdb94-946lv"] Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.623348 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-scripts\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc 
kubenswrapper[4965]: I1125 15:26:45.623671 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-internal-tls-certs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.623718 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-public-tls-certs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.623754 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-config-data\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.623769 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f31396-3b21-4e2e-981e-32196692ab5d-logs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.623807 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bnm\" (UniqueName: \"kubernetes.io/projected/58f31396-3b21-4e2e-981e-32196692ab5d-kube-api-access-z2bnm\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 
15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.623831 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-combined-ca-bundle\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.726658 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-config-data\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.726734 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f31396-3b21-4e2e-981e-32196692ab5d-logs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.726813 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bnm\" (UniqueName: \"kubernetes.io/projected/58f31396-3b21-4e2e-981e-32196692ab5d-kube-api-access-z2bnm\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.726842 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-combined-ca-bundle\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.726914 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-scripts\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.726990 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-internal-tls-certs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.727078 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-public-tls-certs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.727662 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f31396-3b21-4e2e-981e-32196692ab5d-logs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.734417 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-internal-tls-certs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.735069 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-config-data\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.739384 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-scripts\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.739515 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-public-tls-certs\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.744330 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bnm\" (UniqueName: \"kubernetes.io/projected/58f31396-3b21-4e2e-981e-32196692ab5d-kube-api-access-z2bnm\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.767492 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f31396-3b21-4e2e-981e-32196692ab5d-combined-ca-bundle\") pod \"placement-55bb9cdb94-946lv\" (UID: \"58f31396-3b21-4e2e-981e-32196692ab5d\") " pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.829508 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.841451 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.934622 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m57h\" (UniqueName: \"kubernetes.io/projected/00430c42-5b8c-45e7-97d3-c4c256468678-kube-api-access-7m57h\") pod \"00430c42-5b8c-45e7-97d3-c4c256468678\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.934728 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-db-sync-config-data\") pod \"00430c42-5b8c-45e7-97d3-c4c256468678\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.934804 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-combined-ca-bundle\") pod \"00430c42-5b8c-45e7-97d3-c4c256468678\" (UID: \"00430c42-5b8c-45e7-97d3-c4c256468678\") " Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.943605 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "00430c42-5b8c-45e7-97d3-c4c256468678" (UID: "00430c42-5b8c-45e7-97d3-c4c256468678"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.948131 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00430c42-5b8c-45e7-97d3-c4c256468678-kube-api-access-7m57h" (OuterVolumeSpecName: "kube-api-access-7m57h") pod "00430c42-5b8c-45e7-97d3-c4c256468678" (UID: "00430c42-5b8c-45e7-97d3-c4c256468678"). InnerVolumeSpecName "kube-api-access-7m57h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:45 crc kubenswrapper[4965]: I1125 15:26:45.968189 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00430c42-5b8c-45e7-97d3-c4c256468678" (UID: "00430c42-5b8c-45e7-97d3-c4c256468678"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.038206 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m57h\" (UniqueName: \"kubernetes.io/projected/00430c42-5b8c-45e7-97d3-c4c256468678-kube-api-access-7m57h\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.038580 4965 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.038600 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00430c42-5b8c-45e7-97d3-c4c256468678-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.407756 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55bb9cdb94-946lv"] Nov 25 15:26:46 crc kubenswrapper[4965]: 
W1125 15:26:46.422352 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58f31396_3b21_4e2e_981e_32196692ab5d.slice/crio-833bdd8c5c602eb497e02c0e1a1827524083bef762e1b3fb877d3565f586537d WatchSource:0}: Error finding container 833bdd8c5c602eb497e02c0e1a1827524083bef762e1b3fb877d3565f586537d: Status 404 returned error can't find the container with id 833bdd8c5c602eb497e02c0e1a1827524083bef762e1b3fb877d3565f586537d Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.440755 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rg5sn" event={"ID":"00430c42-5b8c-45e7-97d3-c4c256468678","Type":"ContainerDied","Data":"a9f8a43c523d05fead483eec2e02da8a9c6f63a0adb4b4a013332838bb29cf54"} Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.440795 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f8a43c523d05fead483eec2e02da8a9c6f63a0adb4b4a013332838bb29cf54" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.440855 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rg5sn" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.454133 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bb9cdb94-946lv" event={"ID":"58f31396-3b21-4e2e-981e-32196692ab5d","Type":"ContainerStarted","Data":"833bdd8c5c602eb497e02c0e1a1827524083bef762e1b3fb877d3565f586537d"} Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.671758 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b86cffbd7-spglj"] Nov 25 15:26:46 crc kubenswrapper[4965]: E1125 15:26:46.673649 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00430c42-5b8c-45e7-97d3-c4c256468678" containerName="barbican-db-sync" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.675135 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="00430c42-5b8c-45e7-97d3-c4c256468678" containerName="barbican-db-sync" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.675459 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="00430c42-5b8c-45e7-97d3-c4c256468678" containerName="barbican-db-sync" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.676383 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.683455 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.684677 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gsxq7" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.697892 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.732640 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5bf85454d4-vsrlb"] Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.734040 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.752600 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.755734 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-config-data\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.755908 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7960ffe1-ecc7-4e83-9255-8f57e8707289-logs\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 
15:26:46.756096 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-combined-ca-bundle\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.756190 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-config-data-custom\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.756310 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl77p\" (UniqueName: \"kubernetes.io/projected/7960ffe1-ecc7-4e83-9255-8f57e8707289-kube-api-access-tl77p\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.769865 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b86cffbd7-spglj"] Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.830705 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bf85454d4-vsrlb"] Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.877426 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-config-data-custom\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " 
pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.877577 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl77p\" (UniqueName: \"kubernetes.io/projected/7960ffe1-ecc7-4e83-9255-8f57e8707289-kube-api-access-tl77p\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.877737 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-config-data\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.877761 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7960ffe1-ecc7-4e83-9255-8f57e8707289-logs\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.877796 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-config-data\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.877861 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-combined-ca-bundle\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" 
(UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.882639 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpzwc\" (UniqueName: \"kubernetes.io/projected/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-kube-api-access-lpzwc\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.882791 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-logs\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.882878 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-combined-ca-bundle\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.883330 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-config-data-custom\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.886987 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7960ffe1-ecc7-4e83-9255-8f57e8707289-logs\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.888908 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-config-data\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.890801 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-combined-ca-bundle\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.947692 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7960ffe1-ecc7-4e83-9255-8f57e8707289-config-data-custom\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.982192 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl77p\" (UniqueName: \"kubernetes.io/projected/7960ffe1-ecc7-4e83-9255-8f57e8707289-kube-api-access-tl77p\") pod \"barbican-worker-6b86cffbd7-spglj\" (UID: \"7960ffe1-ecc7-4e83-9255-8f57e8707289\") " pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.986861 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-config-data\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.987190 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-combined-ca-bundle\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.987329 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpzwc\" (UniqueName: \"kubernetes.io/projected/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-kube-api-access-lpzwc\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.987492 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-logs\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.987684 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-config-data-custom\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:46 crc kubenswrapper[4965]: I1125 15:26:46.992409 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-logs\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.009703 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-combined-ca-bundle\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.010417 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b86cffbd7-spglj" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.012567 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-config-data-custom\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.022702 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-config-data\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.079714 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpzwc\" (UniqueName: 
\"kubernetes.io/projected/6a6c4bed-60b7-41ee-b4b0-412bb3e25989-kube-api-access-lpzwc\") pod \"barbican-keystone-listener-5bf85454d4-vsrlb\" (UID: \"6a6c4bed-60b7-41ee-b4b0-412bb3e25989\") " pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.085088 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b4f48597-hxm64"] Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.086485 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.101459 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.103056 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b4f48597-hxm64"] Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.191757 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-config\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.191803 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-sb\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.191844 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-dns-svc\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.191865 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-nb\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.191901 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56w9n\" (UniqueName: \"kubernetes.io/projected/e37fd35e-99e8-4837-87f9-97a5dd1664bd-kube-api-access-56w9n\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.192003 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-759b699686-scwl8"] Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.193303 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.194979 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.205049 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759b699686-scwl8"] Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.299479 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.299627 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-config\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.299683 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-sb\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.299788 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-dns-svc\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 
15:26:47.299842 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-nb\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.299866 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-combined-ca-bundle\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.299949 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56w9n\" (UniqueName: \"kubernetes.io/projected/e37fd35e-99e8-4837-87f9-97a5dd1664bd-kube-api-access-56w9n\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.300113 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-logs\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.300180 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data-custom\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc 
kubenswrapper[4965]: I1125 15:26:47.300196 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qqcl\" (UniqueName: \"kubernetes.io/projected/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-kube-api-access-9qqcl\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.300750 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-nb\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.301344 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-dns-svc\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.301649 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-config\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.335384 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-sb\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.340595 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-56w9n\" (UniqueName: \"kubernetes.io/projected/e37fd35e-99e8-4837-87f9-97a5dd1664bd-kube-api-access-56w9n\") pod \"dnsmasq-dns-66b4f48597-hxm64\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") " pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.402682 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-combined-ca-bundle\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.403089 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-logs\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.403121 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data-custom\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.403136 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qqcl\" (UniqueName: \"kubernetes.io/projected/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-kube-api-access-9qqcl\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.403157 4965 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.404189 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-logs\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.412091 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-combined-ca-bundle\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.415331 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.415678 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data-custom\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.431592 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qqcl\" (UniqueName: 
\"kubernetes.io/projected/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-kube-api-access-9qqcl\") pod \"barbican-api-759b699686-scwl8\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.433545 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.505795 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-scripts\") pod \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.505837 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dkh6\" (UniqueName: \"kubernetes.io/projected/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-kube-api-access-5dkh6\") pod \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.505867 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-fernet-keys\") pod \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.505883 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-combined-ca-bundle\") pod \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.505923 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-credential-keys\") pod \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.506018 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-config-data\") pod \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\" (UID: \"3aac4c8e-9a2b-4268-b9c9-c2920b585f64\") " Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.514541 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-kube-api-access-5dkh6" (OuterVolumeSpecName: "kube-api-access-5dkh6") pod "3aac4c8e-9a2b-4268-b9c9-c2920b585f64" (UID: "3aac4c8e-9a2b-4268-b9c9-c2920b585f64"). InnerVolumeSpecName "kube-api-access-5dkh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.539674 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-scripts" (OuterVolumeSpecName: "scripts") pod "3aac4c8e-9a2b-4268-b9c9-c2920b585f64" (UID: "3aac4c8e-9a2b-4268-b9c9-c2920b585f64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.540249 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3aac4c8e-9a2b-4268-b9c9-c2920b585f64" (UID: "3aac4c8e-9a2b-4268-b9c9-c2920b585f64"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.542162 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bb9cdb94-946lv" event={"ID":"58f31396-3b21-4e2e-981e-32196692ab5d","Type":"ContainerStarted","Data":"202646be114896da652f14ba86e4d9744070695fbf74fc8a6a92a7ff8b9923af"} Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.545478 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3aac4c8e-9a2b-4268-b9c9-c2920b585f64" (UID: "3aac4c8e-9a2b-4268-b9c9-c2920b585f64"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.548130 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aac4c8e-9a2b-4268-b9c9-c2920b585f64" (UID: "3aac4c8e-9a2b-4268-b9c9-c2920b585f64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.572181 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-config-data" (OuterVolumeSpecName: "config-data") pod "3aac4c8e-9a2b-4268-b9c9-c2920b585f64" (UID: "3aac4c8e-9a2b-4268-b9c9-c2920b585f64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.575041 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sj59d" event={"ID":"3aac4c8e-9a2b-4268-b9c9-c2920b585f64","Type":"ContainerDied","Data":"68945cf5110338c70acf5fc1ca6100bf355e0dee41d02fb9aeba71a4e7195e48"} Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.575078 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68945cf5110338c70acf5fc1ca6100bf355e0dee41d02fb9aeba71a4e7195e48" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.575145 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sj59d" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.592602 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.611935 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.611984 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dkh6\" (UniqueName: \"kubernetes.io/projected/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-kube-api-access-5dkh6\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.611996 4965 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.612007 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.612016 4965 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.612027 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aac4c8e-9a2b-4268-b9c9-c2920b585f64-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.641248 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:47 crc kubenswrapper[4965]: I1125 15:26:47.910763 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b86cffbd7-spglj"] Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.166059 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bf85454d4-vsrlb"] Nov 25 15:26:48 crc kubenswrapper[4965]: E1125 15:26:48.167234 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aac4c8e_9a2b_4268_b9c9_c2920b585f64.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aac4c8e_9a2b_4268_b9c9_c2920b585f64.slice/crio-68945cf5110338c70acf5fc1ca6100bf355e0dee41d02fb9aeba71a4e7195e48\": RecentStats: unable to find data in memory cache]" Nov 25 15:26:48 crc kubenswrapper[4965]: W1125 15:26:48.185731 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a6c4bed_60b7_41ee_b4b0_412bb3e25989.slice/crio-38ba2e872822efbdf0a3f231cea977c7fad753cc20ed1a369fc55032c6792fbe WatchSource:0}: Error finding container 
38ba2e872822efbdf0a3f231cea977c7fad753cc20ed1a369fc55032c6792fbe: Status 404 returned error can't find the container with id 38ba2e872822efbdf0a3f231cea977c7fad753cc20ed1a369fc55032c6792fbe Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.512024 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759b699686-scwl8"] Nov 25 15:26:48 crc kubenswrapper[4965]: W1125 15:26:48.522202 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ee52a3a_c2d2_46e2_9a6d_f4fc577ebceb.slice/crio-ef410f9d753f428937f00f6ae24c21e583e174550fbb345322fddcd0df4fe680 WatchSource:0}: Error finding container ef410f9d753f428937f00f6ae24c21e583e174550fbb345322fddcd0df4fe680: Status 404 returned error can't find the container with id ef410f9d753f428937f00f6ae24c21e583e174550fbb345322fddcd0df4fe680 Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.592612 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cr787" event={"ID":"0546399d-f1ee-4fe8-aa16-fb64e9f58899","Type":"ContainerStarted","Data":"7d10150452146b84ff4b8b901f36e5add7cc84ac487cefe12062b670af70c4ef"} Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.602673 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" event={"ID":"6a6c4bed-60b7-41ee-b4b0-412bb3e25989","Type":"ContainerStarted","Data":"38ba2e872822efbdf0a3f231cea977c7fad753cc20ed1a369fc55032c6792fbe"} Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.611071 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lmtqd" event={"ID":"78768f1b-d9d5-4124-8fe5-bc4b357605ca","Type":"ContainerStarted","Data":"a1966663a4a8faa990a6cfe8007baafbbf17a1beca1be322056816ffbc5a8efd"} Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.621280 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-759b699686-scwl8" event={"ID":"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb","Type":"ContainerStarted","Data":"ef410f9d753f428937f00f6ae24c21e583e174550fbb345322fddcd0df4fe680"} Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.627793 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bb9cdb94-946lv" event={"ID":"58f31396-3b21-4e2e-981e-32196692ab5d","Type":"ContainerStarted","Data":"93893332ff1590948fa2790c12dbd1c7d2a879513d3989f3ad1bca6dc9cfdf30"} Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.628543 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.628578 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.630568 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b86cffbd7-spglj" event={"ID":"7960ffe1-ecc7-4e83-9255-8f57e8707289","Type":"ContainerStarted","Data":"8486c2aea0ff52e9568138ea8230cafdbba031e0acf8b9242d5718e5674f8a70"} Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.650212 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b4f48597-hxm64"] Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.682347 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-cr787" podStartSLOduration=3.290704133 podStartE2EDuration="1m20.682323211s" podCreationTimestamp="2025-11-25 15:25:28 +0000 UTC" firstStartedPulling="2025-11-25 15:25:28.977286056 +0000 UTC m=+1273.944879792" lastFinishedPulling="2025-11-25 15:26:46.368905134 +0000 UTC m=+1351.336498870" observedRunningTime="2025-11-25 15:26:48.66242302 +0000 UTC m=+1353.630016796" watchObservedRunningTime="2025-11-25 15:26:48.682323211 +0000 UTC m=+1353.649916957" Nov 25 15:26:48 crc 
kubenswrapper[4965]: I1125 15:26:48.690835 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b6574b966-rvvgn"] Nov 25 15:26:48 crc kubenswrapper[4965]: E1125 15:26:48.691444 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aac4c8e-9a2b-4268-b9c9-c2920b585f64" containerName="keystone-bootstrap" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.691473 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aac4c8e-9a2b-4268-b9c9-c2920b585f64" containerName="keystone-bootstrap" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.691658 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aac4c8e-9a2b-4268-b9c9-c2920b585f64" containerName="keystone-bootstrap" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.700636 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: W1125 15:26:48.701261 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode37fd35e_99e8_4837_87f9_97a5dd1664bd.slice/crio-f412cb553b3fac4d9424fa3604101a1a2ff63993ceffe351be17b3ffdaa62632 WatchSource:0}: Error finding container f412cb553b3fac4d9424fa3604101a1a2ff63993ceffe351be17b3ffdaa62632: Status 404 returned error can't find the container with id f412cb553b3fac4d9424fa3604101a1a2ff63993ceffe351be17b3ffdaa62632 Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.715741 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.716132 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.716363 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 25 15:26:48 crc 
kubenswrapper[4965]: I1125 15:26:48.716537 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.716832 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.723847 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v47cv" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.731262 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55bb9cdb94-946lv" podStartSLOduration=3.731238812 podStartE2EDuration="3.731238812s" podCreationTimestamp="2025-11-25 15:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:48.702771917 +0000 UTC m=+1353.670365693" watchObservedRunningTime="2025-11-25 15:26:48.731238812 +0000 UTC m=+1353.698832548" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.767875 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-internal-tls-certs\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.767945 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-public-tls-certs\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.767989 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-scripts\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.768042 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-credential-keys\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.781183 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-config-data\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.781266 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsnm\" (UniqueName: \"kubernetes.io/projected/137be586-5f9e-4e81-a676-b6c30c501608-kube-api-access-clsnm\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.781447 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-fernet-keys\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.781498 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-combined-ca-bundle\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.836437 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b6574b966-rvvgn"] Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.838820 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lmtqd" podStartSLOduration=4.179981302 podStartE2EDuration="53.838794469s" podCreationTimestamp="2025-11-25 15:25:55 +0000 UTC" firstStartedPulling="2025-11-25 15:25:56.621056094 +0000 UTC m=+1301.588649840" lastFinishedPulling="2025-11-25 15:26:46.279869261 +0000 UTC m=+1351.247463007" observedRunningTime="2025-11-25 15:26:48.752699866 +0000 UTC m=+1353.720293612" watchObservedRunningTime="2025-11-25 15:26:48.838794469 +0000 UTC m=+1353.806388215" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892239 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-public-tls-certs\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892299 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-scripts\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892342 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-credential-keys\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892375 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-config-data\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892398 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clsnm\" (UniqueName: \"kubernetes.io/projected/137be586-5f9e-4e81-a676-b6c30c501608-kube-api-access-clsnm\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892436 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-fernet-keys\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892455 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-combined-ca-bundle\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.892511 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-internal-tls-certs\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.914071 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-internal-tls-certs\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.915138 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-combined-ca-bundle\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.915446 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-config-data\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.916255 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-scripts\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.920250 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-fernet-keys\") pod \"keystone-6b6574b966-rvvgn\" (UID: 
\"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.928124 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsnm\" (UniqueName: \"kubernetes.io/projected/137be586-5f9e-4e81-a676-b6c30c501608-kube-api-access-clsnm\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.928491 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-public-tls-certs\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:48 crc kubenswrapper[4965]: I1125 15:26:48.931310 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/137be586-5f9e-4e81-a676-b6c30c501608-credential-keys\") pod \"keystone-6b6574b966-rvvgn\" (UID: \"137be586-5f9e-4e81-a676-b6c30c501608\") " pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:49 crc kubenswrapper[4965]: I1125 15:26:49.056561 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:49 crc kubenswrapper[4965]: I1125 15:26:49.662408 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b699686-scwl8" event={"ID":"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb","Type":"ContainerStarted","Data":"e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771"} Nov 25 15:26:49 crc kubenswrapper[4965]: I1125 15:26:49.676663 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" event={"ID":"e37fd35e-99e8-4837-87f9-97a5dd1664bd","Type":"ContainerStarted","Data":"2fbe0f0996a30cc98011f9fa28e3b412f84809219b072dadcf3746d5415197d3"} Nov 25 15:26:49 crc kubenswrapper[4965]: I1125 15:26:49.676715 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" event={"ID":"e37fd35e-99e8-4837-87f9-97a5dd1664bd","Type":"ContainerStarted","Data":"f412cb553b3fac4d9424fa3604101a1a2ff63993ceffe351be17b3ffdaa62632"} Nov 25 15:26:49 crc kubenswrapper[4965]: I1125 15:26:49.784002 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b6574b966-rvvgn"] Nov 25 15:26:49 crc kubenswrapper[4965]: W1125 15:26:49.789840 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137be586_5f9e_4e81_a676_b6c30c501608.slice/crio-aa4cda5aad6ab13580cef25814a63ef6b44443e0b909011e52349472e90ac7b6 WatchSource:0}: Error finding container aa4cda5aad6ab13580cef25814a63ef6b44443e0b909011e52349472e90ac7b6: Status 404 returned error can't find the container with id aa4cda5aad6ab13580cef25814a63ef6b44443e0b909011e52349472e90ac7b6 Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.703952 4965 generic.go:334] "Generic (PLEG): container finished" podID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerID="2fbe0f0996a30cc98011f9fa28e3b412f84809219b072dadcf3746d5415197d3" exitCode=0 Nov 25 15:26:50 crc 
kubenswrapper[4965]: I1125 15:26:50.704131 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" event={"ID":"e37fd35e-99e8-4837-87f9-97a5dd1664bd","Type":"ContainerDied","Data":"2fbe0f0996a30cc98011f9fa28e3b412f84809219b072dadcf3746d5415197d3"} Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.714418 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b699686-scwl8" event={"ID":"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb","Type":"ContainerStarted","Data":"09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464"} Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.714576 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.714613 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.748733 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b6574b966-rvvgn" event={"ID":"137be586-5f9e-4e81-a676-b6c30c501608","Type":"ContainerStarted","Data":"5752821399a2a2bb99b9200c5d6a4ecc6d64790052911912b8962a8724e426c2"} Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.748783 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b6574b966-rvvgn" event={"ID":"137be586-5f9e-4e81-a676-b6c30c501608","Type":"ContainerStarted","Data":"aa4cda5aad6ab13580cef25814a63ef6b44443e0b909011e52349472e90ac7b6"} Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.749536 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.764310 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-759b699686-scwl8" podStartSLOduration=3.76428853 
podStartE2EDuration="3.76428853s" podCreationTimestamp="2025-11-25 15:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:50.760419466 +0000 UTC m=+1355.728013222" watchObservedRunningTime="2025-11-25 15:26:50.76428853 +0000 UTC m=+1355.731882286" Nov 25 15:26:50 crc kubenswrapper[4965]: I1125 15:26:50.791627 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6b6574b966-rvvgn" podStartSLOduration=2.791610234 podStartE2EDuration="2.791610234s" podCreationTimestamp="2025-11-25 15:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:50.788037427 +0000 UTC m=+1355.755631183" watchObservedRunningTime="2025-11-25 15:26:50.791610234 +0000 UTC m=+1355.759203980" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.264953 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fdd878f78-r9vjx"] Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.267795 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.270835 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.271526 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.288003 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fdd878f78-r9vjx"] Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.368764 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-internal-tls-certs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.368998 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-public-tls-certs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.369046 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-combined-ca-bundle\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.369115 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9xgg9\" (UniqueName: \"kubernetes.io/projected/7ab61fa8-7198-48ff-920a-39246bf7d752-kube-api-access-9xgg9\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.369133 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-config-data\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.369313 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-config-data-custom\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.369367 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab61fa8-7198-48ff-920a-39246bf7d752-logs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472145 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xgg9\" (UniqueName: \"kubernetes.io/projected/7ab61fa8-7198-48ff-920a-39246bf7d752-kube-api-access-9xgg9\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472205 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-config-data\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472251 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-config-data-custom\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472295 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab61fa8-7198-48ff-920a-39246bf7d752-logs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472380 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-internal-tls-certs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472424 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-public-tls-certs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.472460 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-combined-ca-bundle\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.473351 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab61fa8-7198-48ff-920a-39246bf7d752-logs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.477382 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-internal-tls-certs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.477432 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-public-tls-certs\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.478938 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-config-data-custom\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.479304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-combined-ca-bundle\") pod 
\"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.490441 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab61fa8-7198-48ff-920a-39246bf7d752-config-data\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.492771 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgg9\" (UniqueName: \"kubernetes.io/projected/7ab61fa8-7198-48ff-920a-39246bf7d752-kube-api-access-9xgg9\") pod \"barbican-api-6fdd878f78-r9vjx\" (UID: \"7ab61fa8-7198-48ff-920a-39246bf7d752\") " pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:51 crc kubenswrapper[4965]: I1125 15:26:51.596580 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:26:53 crc kubenswrapper[4965]: I1125 15:26:53.260818 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:26:53 crc kubenswrapper[4965]: I1125 15:26:53.261176 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:26:54 crc kubenswrapper[4965]: I1125 15:26:54.235734 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fdd878f78-r9vjx"] Nov 25 15:26:54 crc kubenswrapper[4965]: I1125 15:26:54.790630 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" event={"ID":"e37fd35e-99e8-4837-87f9-97a5dd1664bd","Type":"ContainerStarted","Data":"fd17eef67b789033d7496f9201f46cb5076081113eaf11d48f90a1174002a4be"} Nov 25 15:26:54 crc kubenswrapper[4965]: I1125 15:26:54.790958 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:26:54 crc kubenswrapper[4965]: I1125 15:26:54.793885 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fdd878f78-r9vjx" event={"ID":"7ab61fa8-7198-48ff-920a-39246bf7d752","Type":"ContainerStarted","Data":"5f11184ec05a47ae54e7834c9f53e951a00f4183a702803b478b82c691b6c24c"} Nov 25 15:26:54 crc kubenswrapper[4965]: I1125 15:26:54.793925 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fdd878f78-r9vjx" 
event={"ID":"7ab61fa8-7198-48ff-920a-39246bf7d752","Type":"ContainerStarted","Data":"c5d245e74baa00454c084a1dd780341b932f94810f0bf4aa1d4091185ac3bc89"} Nov 25 15:26:56 crc kubenswrapper[4965]: I1125 15:26:56.800903 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" podStartSLOduration=10.80087909 podStartE2EDuration="10.80087909s" podCreationTimestamp="2025-11-25 15:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:26:54.817083036 +0000 UTC m=+1359.784676802" watchObservedRunningTime="2025-11-25 15:26:56.80087909 +0000 UTC m=+1361.768472836" Nov 25 15:27:01 crc kubenswrapper[4965]: I1125 15:27:01.723260 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:01 crc kubenswrapper[4965]: I1125 15:27:01.723235 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.600123 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.711439 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb44dcd7c-j7d8l"] Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.711713 4965 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="dnsmasq-dns" containerID="cri-o://9a0644deafab5ad319904ce6375da97734e9a6042a5e95858e563bc2a0a66275" gracePeriod=10 Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.727591 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.728035 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.783061 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.886313 4965 generic.go:334] "Generic (PLEG): container finished" podID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerID="9a0644deafab5ad319904ce6375da97734e9a6042a5e95858e563bc2a0a66275" exitCode=0 Nov 25 15:27:02 crc kubenswrapper[4965]: I1125 15:27:02.886547 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" event={"ID":"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1","Type":"ContainerDied","Data":"9a0644deafab5ad319904ce6375da97734e9a6042a5e95858e563bc2a0a66275"} Nov 25 15:27:03 crc kubenswrapper[4965]: I1125 15:27:03.617661 4965 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:27:06 crc kubenswrapper[4965]: I1125 15:27:06.806207 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:06 crc kubenswrapper[4965]: I1125 15:27:06.806245 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:07 crc kubenswrapper[4965]: I1125 15:27:07.810184 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:07 crc kubenswrapper[4965]: I1125 15:27:07.810259 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:27:08 crc kubenswrapper[4965]: E1125 15:27:08.233249 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Nov 25 15:27:08 crc kubenswrapper[4965]: E1125 
15:27:08.233474 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-df8f2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fbe835dc-8361-426b-931b-96a5f52d8743): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:27:08 crc kubenswrapper[4965]: E1125 15:27:08.235376 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.275629 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-754f5b77b5-wngzm" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.352496 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5584c797bd-g5v4b"] Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.352764 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5584c797bd-g5v4b" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-api" 
containerID="cri-o://82d21146ff024b4028113639af6015b56a4ee884b1ded69f72a8701ac83bf0c5" gracePeriod=30 Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.353324 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5584c797bd-g5v4b" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-httpd" containerID="cri-o://24f9340243084c7d5873a3f452bdbb9edea5f40fb75191930c2473bdee6e0a84" gracePeriod=30 Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.476832 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.572333 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-nb\") pod \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.572473 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-dns-svc\") pod \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.572497 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-config\") pod \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.572518 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgprz\" (UniqueName: \"kubernetes.io/projected/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-kube-api-access-jgprz\") pod \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\" 
(UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.572590 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-sb\") pod \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\" (UID: \"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1\") " Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.598296 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-kube-api-access-jgprz" (OuterVolumeSpecName: "kube-api-access-jgprz") pod "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" (UID: "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1"). InnerVolumeSpecName "kube-api-access-jgprz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.667589 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" (UID: "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.679489 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.679518 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgprz\" (UniqueName: \"kubernetes.io/projected/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-kube-api-access-jgprz\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.689537 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" (UID: "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.784208 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.834831 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-config" (OuterVolumeSpecName: "config") pod "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" (UID: "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.835348 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" (UID: "db3b400e-15ef-4ffb-90cb-94d7d0f2bea1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.889512 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:08 crc kubenswrapper[4965]: I1125 15:27:08.889549 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.000582 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fdd878f78-r9vjx" event={"ID":"7ab61fa8-7198-48ff-920a-39246bf7d752","Type":"ContainerStarted","Data":"756dc5e2c0cf15bfccc743bcdef2da72889d2872dcde3925d970ec956e39fab1"} Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.000679 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.001318 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.007207 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fdd878f78-r9vjx" podUID="7ab61fa8-7198-48ff-920a-39246bf7d752" containerName="barbican-api-log" probeResult="failure" output="Get 
\"https://10.217.0.146:9311/healthcheck\": dial tcp 10.217.0.146:9311: connect: connection refused" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.028408 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-central-agent" containerID="cri-o://d549e575ad2ce1815025007924d0f8471e010b51e11aa5da478797b2f70db9f5" gracePeriod=30 Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.028717 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.029277 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" event={"ID":"db3b400e-15ef-4ffb-90cb-94d7d0f2bea1","Type":"ContainerDied","Data":"4e03e0b982417b3e10599063bf30074f42148e627854dc5bb6c50aab4b5d621b"} Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.029368 4965 scope.go:117] "RemoveContainer" containerID="9a0644deafab5ad319904ce6375da97734e9a6042a5e95858e563bc2a0a66275" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.029428 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-notification-agent" containerID="cri-o://c51e4db9685e7a431625e15a208bab8d5ee0f4920ccd03e2ff6584c25927f159" gracePeriod=30 Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.029476 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="sg-core" containerID="cri-o://86f6e2e46fc72758cfe5c6cf8334ddfece914ce3a9df52a023fe0e2f72460281" gracePeriod=30 Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.029943 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-6fdd878f78-r9vjx" podStartSLOduration=18.029919642 podStartE2EDuration="18.029919642s" podCreationTimestamp="2025-11-25 15:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:09.029249734 +0000 UTC m=+1373.996843480" watchObservedRunningTime="2025-11-25 15:27:09.029919642 +0000 UTC m=+1373.997513388" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.079468 4965 scope.go:117] "RemoveContainer" containerID="4eb91395aa46b41e81ae1bb77e9df4cf9202a68ef90745acfa8a62c39904043a" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.228951 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb44dcd7c-j7d8l"] Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.235626 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bb44dcd7c-j7d8l"] Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.893244 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:27:09 crc kubenswrapper[4965]: I1125 15:27:09.948661 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.054346 4965 generic.go:334] "Generic (PLEG): container finished" podID="fbe835dc-8361-426b-931b-96a5f52d8743" containerID="86f6e2e46fc72758cfe5c6cf8334ddfece914ce3a9df52a023fe0e2f72460281" exitCode=2 Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.054408 4965 generic.go:334] "Generic (PLEG): container finished" podID="fbe835dc-8361-426b-931b-96a5f52d8743" containerID="d549e575ad2ce1815025007924d0f8471e010b51e11aa5da478797b2f70db9f5" exitCode=0 Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.054460 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerDied","Data":"86f6e2e46fc72758cfe5c6cf8334ddfece914ce3a9df52a023fe0e2f72460281"} Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.054491 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerDied","Data":"d549e575ad2ce1815025007924d0f8471e010b51e11aa5da478797b2f70db9f5"} Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.059785 4965 generic.go:334] "Generic (PLEG): container finished" podID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerID="24f9340243084c7d5873a3f452bdbb9edea5f40fb75191930c2473bdee6e0a84" exitCode=0 Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.059860 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5584c797bd-g5v4b" event={"ID":"67c635f8-b3c2-49fe-b8f3-110550f9e86d","Type":"ContainerDied","Data":"24f9340243084c7d5873a3f452bdbb9edea5f40fb75191930c2473bdee6e0a84"} Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.064570 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b86cffbd7-spglj" event={"ID":"7960ffe1-ecc7-4e83-9255-8f57e8707289","Type":"ContainerStarted","Data":"8d652f44c1e1c128c82ef9c999cfe482810884010b7b37bb66a8bd67fff09c9b"} Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.064630 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b86cffbd7-spglj" event={"ID":"7960ffe1-ecc7-4e83-9255-8f57e8707289","Type":"ContainerStarted","Data":"249a444cdb296972017723c399b394d6c9225ff354de95e4d9de3c86204e548c"} Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.078960 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" event={"ID":"6a6c4bed-60b7-41ee-b4b0-412bb3e25989","Type":"ContainerStarted","Data":"5e708faaa2a9a438ed125a35d7dc2f0904302ab74bef67e7e061915ed14cf3c2"} Nov 25 15:27:10 crc 
kubenswrapper[4965]: I1125 15:27:10.079021 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" event={"ID":"6a6c4bed-60b7-41ee-b4b0-412bb3e25989","Type":"ContainerStarted","Data":"c95c433cdce813894942520d50b45accbed05ab103d8f35b06b6d7e4a0525711"} Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.087616 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b86cffbd7-spglj" podStartSLOduration=3.816714976 podStartE2EDuration="24.087551167s" podCreationTimestamp="2025-11-25 15:26:46 +0000 UTC" firstStartedPulling="2025-11-25 15:26:47.931264235 +0000 UTC m=+1352.898857981" lastFinishedPulling="2025-11-25 15:27:08.202100386 +0000 UTC m=+1373.169694172" observedRunningTime="2025-11-25 15:27:10.08656232 +0000 UTC m=+1375.054156066" watchObservedRunningTime="2025-11-25 15:27:10.087551167 +0000 UTC m=+1375.055144913" Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.129453 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5bf85454d4-vsrlb" podStartSLOduration=4.104592929 podStartE2EDuration="24.129430026s" podCreationTimestamp="2025-11-25 15:26:46 +0000 UTC" firstStartedPulling="2025-11-25 15:26:48.19164993 +0000 UTC m=+1353.159243676" lastFinishedPulling="2025-11-25 15:27:08.216487007 +0000 UTC m=+1373.184080773" observedRunningTime="2025-11-25 15:27:10.101344763 +0000 UTC m=+1375.068938509" watchObservedRunningTime="2025-11-25 15:27:10.129430026 +0000 UTC m=+1375.097023772" Nov 25 15:27:10 crc kubenswrapper[4965]: I1125 15:27:10.783108 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" path="/var/lib/kubelet/pods/db3b400e-15ef-4ffb-90cb-94d7d0f2bea1/volumes" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.029746 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.096612 4965 generic.go:334] "Generic (PLEG): container finished" podID="fbe835dc-8361-426b-931b-96a5f52d8743" containerID="c51e4db9685e7a431625e15a208bab8d5ee0f4920ccd03e2ff6584c25927f159" exitCode=0 Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.097678 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerDied","Data":"c51e4db9685e7a431625e15a208bab8d5ee0f4920ccd03e2ff6584c25927f159"} Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.273164 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394085 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-scripts\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394184 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df8f2\" (UniqueName: \"kubernetes.io/projected/fbe835dc-8361-426b-931b-96a5f52d8743-kube-api-access-df8f2\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394239 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-combined-ca-bundle\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394297 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-run-httpd\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394407 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-sg-core-conf-yaml\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394476 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-config-data\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.394518 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-log-httpd\") pod \"fbe835dc-8361-426b-931b-96a5f52d8743\" (UID: \"fbe835dc-8361-426b-931b-96a5f52d8743\") " Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.395294 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.395371 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.395873 4965 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.395896 4965 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbe835dc-8361-426b-931b-96a5f52d8743-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.400229 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe835dc-8361-426b-931b-96a5f52d8743-kube-api-access-df8f2" (OuterVolumeSpecName: "kube-api-access-df8f2") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). InnerVolumeSpecName "kube-api-access-df8f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.400832 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-scripts" (OuterVolumeSpecName: "scripts") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.423915 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.452090 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.457585 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-config-data" (OuterVolumeSpecName: "config-data") pod "fbe835dc-8361-426b-931b-96a5f52d8743" (UID: "fbe835dc-8361-426b-931b-96a5f52d8743"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.497487 4965 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.497519 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.497528 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.497537 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df8f2\" (UniqueName: \"kubernetes.io/projected/fbe835dc-8361-426b-931b-96a5f52d8743-kube-api-access-df8f2\") on node \"crc\" DevicePath \"\"" Nov 25 
15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.497548 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe835dc-8361-426b-931b-96a5f52d8743-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:12 crc kubenswrapper[4965]: I1125 15:27:12.783773 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bb44dcd7c-j7d8l" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.108370 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbe835dc-8361-426b-931b-96a5f52d8743","Type":"ContainerDied","Data":"24aa86de256f738185dbb129671078de0b213dc0b5e07176688cc0f1f1f707b5"} Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.108418 4965 scope.go:117] "RemoveContainer" containerID="86f6e2e46fc72758cfe5c6cf8334ddfece914ce3a9df52a023fe0e2f72460281" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.108572 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.137279 4965 scope.go:117] "RemoveContainer" containerID="c51e4db9685e7a431625e15a208bab8d5ee0f4920ccd03e2ff6584c25927f159" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.203839 4965 scope.go:117] "RemoveContainer" containerID="d549e575ad2ce1815025007924d0f8471e010b51e11aa5da478797b2f70db9f5" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.239239 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.256829 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.264955 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:13 crc kubenswrapper[4965]: E1125 15:27:13.265403 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="sg-core" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265430 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="sg-core" Nov 25 15:27:13 crc kubenswrapper[4965]: E1125 15:27:13.265458 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="init" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265475 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="init" Nov 25 15:27:13 crc kubenswrapper[4965]: E1125 15:27:13.265489 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="dnsmasq-dns" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265498 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="dnsmasq-dns" 
Nov 25 15:27:13 crc kubenswrapper[4965]: E1125 15:27:13.265519 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-central-agent" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265528 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-central-agent" Nov 25 15:27:13 crc kubenswrapper[4965]: E1125 15:27:13.265633 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-notification-agent" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265645 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-notification-agent" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265883 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="sg-core" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265913 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-notification-agent" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265930 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" containerName="ceilometer-central-agent" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.265942 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3b400e-15ef-4ffb-90cb-94d7d0f2bea1" containerName="dnsmasq-dns" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.267908 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.270464 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.271247 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.280577 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434621 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434690 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-run-httpd\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434843 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-config-data\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434889 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-log-httpd\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " 
pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434913 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434937 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxgtv\" (UniqueName: \"kubernetes.io/projected/ec223010-25ed-49bf-a840-9b04638e8de4-kube-api-access-kxgtv\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.434957 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-scripts\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536257 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-config-data\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536318 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-log-httpd\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536338 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536358 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxgtv\" (UniqueName: \"kubernetes.io/projected/ec223010-25ed-49bf-a840-9b04638e8de4-kube-api-access-kxgtv\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536374 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-scripts\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536425 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.536448 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-run-httpd\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.537004 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-log-httpd\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc 
kubenswrapper[4965]: I1125 15:27:13.537023 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-run-httpd\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.543309 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.543729 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-scripts\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.552875 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.552310 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-config-data\") pod \"ceilometer-0\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.558180 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxgtv\" (UniqueName: \"kubernetes.io/projected/ec223010-25ed-49bf-a840-9b04638e8de4-kube-api-access-kxgtv\") pod \"ceilometer-0\" (UID: 
\"ec223010-25ed-49bf-a840-9b04638e8de4\") " pod="openstack/ceilometer-0" Nov 25 15:27:13 crc kubenswrapper[4965]: I1125 15:27:13.591559 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.103346 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.107053 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.139339 4965 generic.go:334] "Generic (PLEG): container finished" podID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerID="82d21146ff024b4028113639af6015b56a4ee884b1ded69f72a8701ac83bf0c5" exitCode=0 Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.139396 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5584c797bd-g5v4b" event={"ID":"67c635f8-b3c2-49fe-b8f3-110550f9e86d","Type":"ContainerDied","Data":"82d21146ff024b4028113639af6015b56a4ee884b1ded69f72a8701ac83bf0c5"} Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.140870 4965 generic.go:334] "Generic (PLEG): container finished" podID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" containerID="a1966663a4a8faa990a6cfe8007baafbbf17a1beca1be322056816ffbc5a8efd" exitCode=0 Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.140911 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lmtqd" event={"ID":"78768f1b-d9d5-4124-8fe5-bc4b357605ca","Type":"ContainerDied","Data":"a1966663a4a8faa990a6cfe8007baafbbf17a1beca1be322056816ffbc5a8efd"} Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.142595 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerStarted","Data":"efe6ed093ce42c4b83e4162b74322ba0d2db21415a744f1e43f69db5e881bf39"} Nov 25 15:27:14 crc 
kubenswrapper[4965]: I1125 15:27:14.402861 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.552550 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-config\") pod \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.552695 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-ovndb-tls-certs\") pod \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.552831 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-httpd-config\") pod \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.552881 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m2fl\" (UniqueName: \"kubernetes.io/projected/67c635f8-b3c2-49fe-b8f3-110550f9e86d-kube-api-access-2m2fl\") pod \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.552910 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-combined-ca-bundle\") pod \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\" (UID: \"67c635f8-b3c2-49fe-b8f3-110550f9e86d\") " Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.562140 4965 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "67c635f8-b3c2-49fe-b8f3-110550f9e86d" (UID: "67c635f8-b3c2-49fe-b8f3-110550f9e86d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.562193 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c635f8-b3c2-49fe-b8f3-110550f9e86d-kube-api-access-2m2fl" (OuterVolumeSpecName: "kube-api-access-2m2fl") pod "67c635f8-b3c2-49fe-b8f3-110550f9e86d" (UID: "67c635f8-b3c2-49fe-b8f3-110550f9e86d"). InnerVolumeSpecName "kube-api-access-2m2fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.612151 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67c635f8-b3c2-49fe-b8f3-110550f9e86d" (UID: "67c635f8-b3c2-49fe-b8f3-110550f9e86d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.621638 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-config" (OuterVolumeSpecName: "config") pod "67c635f8-b3c2-49fe-b8f3-110550f9e86d" (UID: "67c635f8-b3c2-49fe-b8f3-110550f9e86d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.648805 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "67c635f8-b3c2-49fe-b8f3-110550f9e86d" (UID: "67c635f8-b3c2-49fe-b8f3-110550f9e86d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.654600 4965 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.654782 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m2fl\" (UniqueName: \"kubernetes.io/projected/67c635f8-b3c2-49fe-b8f3-110550f9e86d-kube-api-access-2m2fl\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.654845 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.654901 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.654951 4965 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c635f8-b3c2-49fe-b8f3-110550f9e86d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:14 crc kubenswrapper[4965]: I1125 15:27:14.786994 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe835dc-8361-426b-931b-96a5f52d8743" 
path="/var/lib/kubelet/pods/fbe835dc-8361-426b-931b-96a5f52d8743/volumes" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.154926 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5584c797bd-g5v4b" event={"ID":"67c635f8-b3c2-49fe-b8f3-110550f9e86d","Type":"ContainerDied","Data":"c19dc1290439cf0df01092ace1c49007a3439744671a1aa8bece360a66fe2264"} Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.154993 4965 scope.go:117] "RemoveContainer" containerID="24f9340243084c7d5873a3f452bdbb9edea5f40fb75191930c2473bdee6e0a84" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.155733 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5584c797bd-g5v4b" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.161533 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerStarted","Data":"44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a"} Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.189865 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5584c797bd-g5v4b"] Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.197501 4965 scope.go:117] "RemoveContainer" containerID="82d21146ff024b4028113639af6015b56a4ee884b1ded69f72a8701ac83bf0c5" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.201721 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5584c797bd-g5v4b"] Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.466339 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.568857 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78768f1b-d9d5-4124-8fe5-bc4b357605ca-etc-machine-id\") pod \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569054 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78768f1b-d9d5-4124-8fe5-bc4b357605ca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "78768f1b-d9d5-4124-8fe5-bc4b357605ca" (UID: "78768f1b-d9d5-4124-8fe5-bc4b357605ca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569105 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-scripts\") pod \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569137 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kdc4\" (UniqueName: \"kubernetes.io/projected/78768f1b-d9d5-4124-8fe5-bc4b357605ca-kube-api-access-4kdc4\") pod \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569168 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-config-data\") pod \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569198 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-combined-ca-bundle\") pod \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569266 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-db-sync-config-data\") pod \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\" (UID: \"78768f1b-d9d5-4124-8fe5-bc4b357605ca\") " Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.569711 4965 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78768f1b-d9d5-4124-8fe5-bc4b357605ca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.575259 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "78768f1b-d9d5-4124-8fe5-bc4b357605ca" (UID: "78768f1b-d9d5-4124-8fe5-bc4b357605ca"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.576318 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-scripts" (OuterVolumeSpecName: "scripts") pod "78768f1b-d9d5-4124-8fe5-bc4b357605ca" (UID: "78768f1b-d9d5-4124-8fe5-bc4b357605ca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.577189 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78768f1b-d9d5-4124-8fe5-bc4b357605ca-kube-api-access-4kdc4" (OuterVolumeSpecName: "kube-api-access-4kdc4") pod "78768f1b-d9d5-4124-8fe5-bc4b357605ca" (UID: "78768f1b-d9d5-4124-8fe5-bc4b357605ca"). InnerVolumeSpecName "kube-api-access-4kdc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.600545 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78768f1b-d9d5-4124-8fe5-bc4b357605ca" (UID: "78768f1b-d9d5-4124-8fe5-bc4b357605ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.625165 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-config-data" (OuterVolumeSpecName: "config-data") pod "78768f1b-d9d5-4124-8fe5-bc4b357605ca" (UID: "78768f1b-d9d5-4124-8fe5-bc4b357605ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.671333 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.671369 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kdc4\" (UniqueName: \"kubernetes.io/projected/78768f1b-d9d5-4124-8fe5-bc4b357605ca-kube-api-access-4kdc4\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.671380 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.671392 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:15 crc kubenswrapper[4965]: I1125 15:27:15.671400 4965 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78768f1b-d9d5-4124-8fe5-bc4b357605ca-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.170401 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerStarted","Data":"e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c"} Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.173374 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lmtqd" event={"ID":"78768f1b-d9d5-4124-8fe5-bc4b357605ca","Type":"ContainerDied","Data":"c21fb1973289bc2c6c344e7b6d96181c76c7486ea9b3b91c629d41271e968439"} Nov 
25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.173408 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21fb1973289bc2c6c344e7b6d96181c76c7486ea9b3b91c629d41271e968439" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.173460 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lmtqd" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.497543 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:16 crc kubenswrapper[4965]: E1125 15:27:16.497928 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" containerName="cinder-db-sync" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.497946 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" containerName="cinder-db-sync" Nov 25 15:27:16 crc kubenswrapper[4965]: E1125 15:27:16.497962 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-api" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.497992 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-api" Nov 25 15:27:16 crc kubenswrapper[4965]: E1125 15:27:16.498013 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-httpd" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.498019 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-httpd" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.498175 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-api" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.498194 4965 
memory_manager.go:354] "RemoveStaleState removing state" podUID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" containerName="cinder-db-sync" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.498209 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" containerName="neutron-httpd" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.499890 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.505091 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.505582 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-scrvm" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.510034 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.510211 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.538338 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.586873 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d9bb97-2095-4f82-81e9-2fb5fc578079-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.586941 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.587016 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6tc\" (UniqueName: \"kubernetes.io/projected/65d9bb97-2095-4f82-81e9-2fb5fc578079-kube-api-access-4c6tc\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.602000 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.602161 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-scripts\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.602189 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.666055 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b99c4f867-9xrlr"] Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.667481 4965 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706064 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706133 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706152 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-scripts\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706194 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d9bb97-2095-4f82-81e9-2fb5fc578079-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706212 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706240 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6tc\" (UniqueName: \"kubernetes.io/projected/65d9bb97-2095-4f82-81e9-2fb5fc578079-kube-api-access-4c6tc\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.706837 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d9bb97-2095-4f82-81e9-2fb5fc578079-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.711363 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.711467 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.714017 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b99c4f867-9xrlr"] Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.715354 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.721629 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-scripts\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.721685 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.731860 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.745093 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.746553 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.748491 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.748889 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.761956 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fdd878f78-r9vjx" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.799127 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6tc\" (UniqueName: \"kubernetes.io/projected/65d9bb97-2095-4f82-81e9-2fb5fc578079-kube-api-access-4c6tc\") pod \"cinder-scheduler-0\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807676 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-dns-svc\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807732 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-nb\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807785 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807801 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfef716b-a6c5-4237-91b5-0078ce32f42d-logs\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807887 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-scripts\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807935 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data\") 
pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807954 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.807993 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-sb\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.808009 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfef716b-a6c5-4237-91b5-0078ce32f42d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.808025 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrlv\" (UniqueName: \"kubernetes.io/projected/067a3029-be3c-486c-bf8a-795d8f2e55f8-kube-api-access-4qrlv\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.808043 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhzjz\" (UniqueName: \"kubernetes.io/projected/bfef716b-a6c5-4237-91b5-0078ce32f42d-kube-api-access-rhzjz\") pod \"cinder-api-0\" (UID: 
\"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.808056 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-config\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.826799 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-scrvm" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.832166 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.850719 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c635f8-b3c2-49fe-b8f3-110550f9e86d" path="/var/lib/kubelet/pods/67c635f8-b3c2-49fe-b8f3-110550f9e86d/volumes" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.851458 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911135 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911223 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-sb\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911241 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfef716b-a6c5-4237-91b5-0078ce32f42d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911275 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrlv\" (UniqueName: \"kubernetes.io/projected/067a3029-be3c-486c-bf8a-795d8f2e55f8-kube-api-access-4qrlv\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911296 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhzjz\" (UniqueName: \"kubernetes.io/projected/bfef716b-a6c5-4237-91b5-0078ce32f42d-kube-api-access-rhzjz\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911312 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-config\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911347 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-dns-svc\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911379 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-nb\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911412 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911430 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfef716b-a6c5-4237-91b5-0078ce32f42d-logs\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.911525 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-scripts\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.913445 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-config\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 
15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.914319 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfef716b-a6c5-4237-91b5-0078ce32f42d-logs\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.914853 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-dns-svc\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.915368 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-nb\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.916145 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-sb\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.916191 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfef716b-a6c5-4237-91b5-0078ce32f42d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.935086 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.935584 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.935931 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.936460 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-scripts\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.954613 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhzjz\" (UniqueName: \"kubernetes.io/projected/bfef716b-a6c5-4237-91b5-0078ce32f42d-kube-api-access-rhzjz\") pod \"cinder-api-0\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " pod="openstack/cinder-api-0" Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.961703 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrlv\" (UniqueName: \"kubernetes.io/projected/067a3029-be3c-486c-bf8a-795d8f2e55f8-kube-api-access-4qrlv\") pod \"dnsmasq-dns-b99c4f867-9xrlr\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:16 crc 
kubenswrapper[4965]: I1125 15:27:16.964764 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-759b699686-scwl8"] Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.964951 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" containerID="cri-o://e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771" gracePeriod=30 Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.965123 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" containerID="cri-o://09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464" gracePeriod=30 Nov 25 15:27:16 crc kubenswrapper[4965]: I1125 15:27:16.997383 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.212859 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.238298 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerStarted","Data":"7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047"} Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.267745 4965 generic.go:334] "Generic (PLEG): container finished" podID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerID="e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771" exitCode=143 Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.267780 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b699686-scwl8" event={"ID":"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb","Type":"ContainerDied","Data":"e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771"} Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.679363 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b99c4f867-9xrlr"] Nov 25 15:27:17 crc kubenswrapper[4965]: W1125 15:27:17.687481 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod067a3029_be3c_486c_bf8a_795d8f2e55f8.slice/crio-1c0e6ab26858f3a506b273d87fd744dd8b366669841ceefa377d7d000bcc8afa WatchSource:0}: Error finding container 1c0e6ab26858f3a506b273d87fd744dd8b366669841ceefa377d7d000bcc8afa: Status 404 returned error can't find the container with id 1c0e6ab26858f3a506b273d87fd744dd8b366669841ceefa377d7d000bcc8afa Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.693082 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:17 crc kubenswrapper[4965]: I1125 15:27:17.873982 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:18 crc kubenswrapper[4965]: I1125 15:27:18.285496 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65d9bb97-2095-4f82-81e9-2fb5fc578079","Type":"ContainerStarted","Data":"37bcf8dd614b1bc2bd9e1e53ae696c7835e9296a156b04ef363a61d39cc66e6c"} Nov 25 15:27:18 crc kubenswrapper[4965]: I1125 15:27:18.287153 4965 generic.go:334] "Generic (PLEG): container finished" podID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerID="4b46c71a0c77b619168e9701e44bbb46ea366b0301c87902b89146b4954f9a74" exitCode=0 Nov 25 15:27:18 crc kubenswrapper[4965]: I1125 15:27:18.287290 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" event={"ID":"067a3029-be3c-486c-bf8a-795d8f2e55f8","Type":"ContainerDied","Data":"4b46c71a0c77b619168e9701e44bbb46ea366b0301c87902b89146b4954f9a74"} Nov 25 15:27:18 crc kubenswrapper[4965]: I1125 15:27:18.287340 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" event={"ID":"067a3029-be3c-486c-bf8a-795d8f2e55f8","Type":"ContainerStarted","Data":"1c0e6ab26858f3a506b273d87fd744dd8b366669841ceefa377d7d000bcc8afa"} Nov 25 15:27:18 crc kubenswrapper[4965]: I1125 15:27:18.289572 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfef716b-a6c5-4237-91b5-0078ce32f42d","Type":"ContainerStarted","Data":"de561b464461acfd0dcad1947da3ff12fd7f99a08649be49b08b0f6ee382bf53"} Nov 25 15:27:18 crc kubenswrapper[4965]: I1125 15:27:18.940318 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:19 crc kubenswrapper[4965]: I1125 15:27:19.375421 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfef716b-a6c5-4237-91b5-0078ce32f42d","Type":"ContainerStarted","Data":"fe7609d42fea400dcd0e23317a4192bac803d9f9ea080c81804eb10081e4b1a8"} Nov 25 15:27:19 crc kubenswrapper[4965]: I1125 15:27:19.393343 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" event={"ID":"067a3029-be3c-486c-bf8a-795d8f2e55f8","Type":"ContainerStarted","Data":"696e15e409d7de6b543b64e45d2f2d57a461493565bd00c4432ed544fe3cdec6"} Nov 25 15:27:19 crc kubenswrapper[4965]: I1125 15:27:19.393640 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:19 crc kubenswrapper[4965]: I1125 15:27:19.468490 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" podStartSLOduration=3.468474915 podStartE2EDuration="3.468474915s" podCreationTimestamp="2025-11-25 15:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:19.455353379 +0000 UTC m=+1384.422947125" watchObservedRunningTime="2025-11-25 15:27:19.468474915 +0000 UTC m=+1384.436068661" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.398067 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.431653 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65d9bb97-2095-4f82-81e9-2fb5fc578079","Type":"ContainerStarted","Data":"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7"} Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.439577 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerStarted","Data":"a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3"} Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.440996 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.444655 4965 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/placement-55bb9cdb94-946lv" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.450408 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfef716b-a6c5-4237-91b5-0078ce32f42d","Type":"ContainerStarted","Data":"5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926"} Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.451305 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api-log" containerID="cri-o://fe7609d42fea400dcd0e23317a4192bac803d9f9ea080c81804eb10081e4b1a8" gracePeriod=30 Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.451432 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api" containerID="cri-o://5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926" gracePeriod=30 Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.490036 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.42497238 podStartE2EDuration="7.490019579s" podCreationTimestamp="2025-11-25 15:27:13 +0000 UTC" firstStartedPulling="2025-11-25 15:27:14.106828153 +0000 UTC m=+1379.074421899" lastFinishedPulling="2025-11-25 15:27:19.171875352 +0000 UTC m=+1384.139469098" observedRunningTime="2025-11-25 15:27:20.474991751 +0000 UTC m=+1385.442585497" watchObservedRunningTime="2025-11-25 15:27:20.490019579 +0000 UTC m=+1385.457613325" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.529726 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 
10.217.0.2:58704->10.217.0.144:9311: read: connection reset by peer" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.529727 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b699686-scwl8" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:58716->10.217.0.144:9311: read: connection reset by peer" Nov 25 15:27:20 crc kubenswrapper[4965]: I1125 15:27:20.546119 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.546102373 podStartE2EDuration="4.546102373s" podCreationTimestamp="2025-11-25 15:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:20.51618366 +0000 UTC m=+1385.483777406" watchObservedRunningTime="2025-11-25 15:27:20.546102373 +0000 UTC m=+1385.513696109" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.061915 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.102078 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data-custom\") pod \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.102166 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data\") pod \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.102347 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-combined-ca-bundle\") pod \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.102383 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-logs\") pod \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.102426 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qqcl\" (UniqueName: \"kubernetes.io/projected/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-kube-api-access-9qqcl\") pod \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\" (UID: \"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb\") " Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.105040 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-logs" (OuterVolumeSpecName: "logs") pod "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" (UID: "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.112086 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-kube-api-access-9qqcl" (OuterVolumeSpecName: "kube-api-access-9qqcl") pod "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" (UID: "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb"). InnerVolumeSpecName "kube-api-access-9qqcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.114646 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" (UID: "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.151069 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" (UID: "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.179152 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data" (OuterVolumeSpecName: "config-data") pod "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" (UID: "3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.204280 4965 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.204452 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.204533 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.204586 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.204635 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qqcl\" (UniqueName: \"kubernetes.io/projected/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb-kube-api-access-9qqcl\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.459812 4965 generic.go:334] "Generic (PLEG): container finished" podID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerID="09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464" exitCode=0 Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.460169 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b699686-scwl8" event={"ID":"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb","Type":"ContainerDied","Data":"09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464"} Nov 25 
15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.460196 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b699686-scwl8" event={"ID":"3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb","Type":"ContainerDied","Data":"ef410f9d753f428937f00f6ae24c21e583e174550fbb345322fddcd0df4fe680"} Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.460213 4965 scope.go:117] "RemoveContainer" containerID="09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.460316 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759b699686-scwl8" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.477332 4965 generic.go:334] "Generic (PLEG): container finished" podID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerID="fe7609d42fea400dcd0e23317a4192bac803d9f9ea080c81804eb10081e4b1a8" exitCode=143 Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.477368 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfef716b-a6c5-4237-91b5-0078ce32f42d","Type":"ContainerDied","Data":"fe7609d42fea400dcd0e23317a4192bac803d9f9ea080c81804eb10081e4b1a8"} Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.483507 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65d9bb97-2095-4f82-81e9-2fb5fc578079","Type":"ContainerStarted","Data":"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb"} Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.496950 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-759b699686-scwl8"] Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.506292 4965 scope.go:117] "RemoveContainer" containerID="e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.517636 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-759b699686-scwl8"] Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.519538 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.6767902279999998 podStartE2EDuration="5.519526929s" podCreationTimestamp="2025-11-25 15:27:16 +0000 UTC" firstStartedPulling="2025-11-25 15:27:17.7024473 +0000 UTC m=+1382.670041046" lastFinishedPulling="2025-11-25 15:27:19.545183991 +0000 UTC m=+1384.512777747" observedRunningTime="2025-11-25 15:27:21.516419015 +0000 UTC m=+1386.484012761" watchObservedRunningTime="2025-11-25 15:27:21.519526929 +0000 UTC m=+1386.487120675" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.528431 4965 scope.go:117] "RemoveContainer" containerID="09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464" Nov 25 15:27:21 crc kubenswrapper[4965]: E1125 15:27:21.528958 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464\": container with ID starting with 09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464 not found: ID does not exist" containerID="09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.529061 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464"} err="failed to get container status \"09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464\": rpc error: code = NotFound desc = could not find container \"09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464\": container with ID starting with 09e6e4452546a99fe5396743bf9f13f8d1633a3d385a1ae57e80ae018ab49464 not found: ID does not exist" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.529209 4965 scope.go:117] 
"RemoveContainer" containerID="e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771" Nov 25 15:27:21 crc kubenswrapper[4965]: E1125 15:27:21.529532 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771\": container with ID starting with e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771 not found: ID does not exist" containerID="e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.529617 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771"} err="failed to get container status \"e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771\": rpc error: code = NotFound desc = could not find container \"e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771\": container with ID starting with e183a110e51ef565487d7fb3682ea86da3d2f38b8b3e153ce53dac9a2cf54771 not found: ID does not exist" Nov 25 15:27:21 crc kubenswrapper[4965]: I1125 15:27:21.833929 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 15:27:22 crc kubenswrapper[4965]: I1125 15:27:22.214449 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 15:27:22 crc kubenswrapper[4965]: I1125 15:27:22.495053 4965 generic.go:334] "Generic (PLEG): container finished" podID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" containerID="7d10150452146b84ff4b8b901f36e5add7cc84ac487cefe12062b670af70c4ef" exitCode=0 Nov 25 15:27:22 crc kubenswrapper[4965]: I1125 15:27:22.495182 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cr787" 
event={"ID":"0546399d-f1ee-4fe8-aa16-fb64e9f58899","Type":"ContainerDied","Data":"7d10150452146b84ff4b8b901f36e5add7cc84ac487cefe12062b670af70c4ef"} Nov 25 15:27:22 crc kubenswrapper[4965]: I1125 15:27:22.782189 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" path="/var/lib/kubelet/pods/3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb/volumes" Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.260448 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.260919 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.261031 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.266513 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0c4deae36fbd6888b83491cb53bd4ad9a4b3cad48a12bfa6331042ee58854cf"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.266658 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" 
podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://a0c4deae36fbd6888b83491cb53bd4ad9a4b3cad48a12bfa6331042ee58854cf" gracePeriod=600 Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.508354 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="a0c4deae36fbd6888b83491cb53bd4ad9a4b3cad48a12bfa6331042ee58854cf" exitCode=0 Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.509273 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"a0c4deae36fbd6888b83491cb53bd4ad9a4b3cad48a12bfa6331042ee58854cf"} Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.509314 4965 scope.go:117] "RemoveContainer" containerID="be8cabf8c298dce6dc5c47e109690923bbdb10ab8f0bdbfa1738209ba0e27a1b" Nov 25 15:27:23 crc kubenswrapper[4965]: I1125 15:27:23.938953 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-cr787" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.054087 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-config-data\") pod \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.054204 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-db-sync-config-data\") pod \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.054260 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-combined-ca-bundle\") pod \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.054341 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll9bz\" (UniqueName: \"kubernetes.io/projected/0546399d-f1ee-4fe8-aa16-fb64e9f58899-kube-api-access-ll9bz\") pod \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\" (UID: \"0546399d-f1ee-4fe8-aa16-fb64e9f58899\") " Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.064105 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0546399d-f1ee-4fe8-aa16-fb64e9f58899-kube-api-access-ll9bz" (OuterVolumeSpecName: "kube-api-access-ll9bz") pod "0546399d-f1ee-4fe8-aa16-fb64e9f58899" (UID: "0546399d-f1ee-4fe8-aa16-fb64e9f58899"). InnerVolumeSpecName "kube-api-access-ll9bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.076172 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0546399d-f1ee-4fe8-aa16-fb64e9f58899" (UID: "0546399d-f1ee-4fe8-aa16-fb64e9f58899"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.091060 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0546399d-f1ee-4fe8-aa16-fb64e9f58899" (UID: "0546399d-f1ee-4fe8-aa16-fb64e9f58899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.110455 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-config-data" (OuterVolumeSpecName: "config-data") pod "0546399d-f1ee-4fe8-aa16-fb64e9f58899" (UID: "0546399d-f1ee-4fe8-aa16-fb64e9f58899"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.156523 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.156564 4965 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.156579 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0546399d-f1ee-4fe8-aa16-fb64e9f58899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.156594 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll9bz\" (UniqueName: \"kubernetes.io/projected/0546399d-f1ee-4fe8-aa16-fb64e9f58899-kube-api-access-ll9bz\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.516545 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b6574b966-rvvgn" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.518928 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab"} Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.520789 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cr787" event={"ID":"0546399d-f1ee-4fe8-aa16-fb64e9f58899","Type":"ContainerDied","Data":"2a1f7e9d4add1068920f47dcc3551e0dae6153a31a35080f71a2a2d709260ab9"} Nov 25 15:27:24 crc 
kubenswrapper[4965]: I1125 15:27:24.520828 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a1f7e9d4add1068920f47dcc3551e0dae6153a31a35080f71a2a2d709260ab9" Nov 25 15:27:24 crc kubenswrapper[4965]: I1125 15:27:24.520837 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cr787" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.010584 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b99c4f867-9xrlr"] Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.011385 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerName="dnsmasq-dns" containerID="cri-o://696e15e409d7de6b543b64e45d2f2d57a461493565bd00c4432ed544fe3cdec6" gracePeriod=10 Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.013303 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.050832 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"] Nov 25 15:27:25 crc kubenswrapper[4965]: E1125 15:27:25.051222 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" containerName="glance-db-sync" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.051238 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" containerName="glance-db-sync" Nov 25 15:27:25 crc kubenswrapper[4965]: E1125 15:27:25.051254 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.051261 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" 
containerName="barbican-api" Nov 25 15:27:25 crc kubenswrapper[4965]: E1125 15:27:25.051271 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.051277 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.051455 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.051472 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee52a3a-c2d2-46e2-9a6d-f4fc577ebceb" containerName="barbican-api-log" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.051481 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" containerName="glance-db-sync" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.052357 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.079576 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.079621 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.079641 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b24b8\" (UniqueName: \"kubernetes.io/projected/bc935d6a-b651-4c29-9d70-5b9abc6c8580-kube-api-access-b24b8\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.079672 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.079719 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-config\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" 
(UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.096637 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"] Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.187151 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.187507 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.187536 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b24b8\" (UniqueName: \"kubernetes.io/projected/bc935d6a-b651-4c29-9d70-5b9abc6c8580-kube-api-access-b24b8\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.187583 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.187656 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-config\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.188693 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-config\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.189678 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.193647 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.201025 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.215989 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24b8\" (UniqueName: \"kubernetes.io/projected/bc935d6a-b651-4c29-9d70-5b9abc6c8580-kube-api-access-b24b8\") pod \"dnsmasq-dns-6d97fcdd8f-d6mx8\" 
(UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.399950 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.535876 4965 generic.go:334] "Generic (PLEG): container finished" podID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerID="696e15e409d7de6b543b64e45d2f2d57a461493565bd00c4432ed544fe3cdec6" exitCode=0 Nov 25 15:27:25 crc kubenswrapper[4965]: I1125 15:27:25.536630 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" event={"ID":"067a3029-be3c-486c-bf8a-795d8f2e55f8","Type":"ContainerDied","Data":"696e15e409d7de6b543b64e45d2f2d57a461493565bd00c4432ed544fe3cdec6"} Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.039981 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"] Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.196102 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.309024 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-nb\") pod \"067a3029-be3c-486c-bf8a-795d8f2e55f8\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.309084 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qrlv\" (UniqueName: \"kubernetes.io/projected/067a3029-be3c-486c-bf8a-795d8f2e55f8-kube-api-access-4qrlv\") pod \"067a3029-be3c-486c-bf8a-795d8f2e55f8\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.309121 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-dns-svc\") pod \"067a3029-be3c-486c-bf8a-795d8f2e55f8\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.309223 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-sb\") pod \"067a3029-be3c-486c-bf8a-795d8f2e55f8\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.309255 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-config\") pod \"067a3029-be3c-486c-bf8a-795d8f2e55f8\" (UID: \"067a3029-be3c-486c-bf8a-795d8f2e55f8\") " Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.318289 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/067a3029-be3c-486c-bf8a-795d8f2e55f8-kube-api-access-4qrlv" (OuterVolumeSpecName: "kube-api-access-4qrlv") pod "067a3029-be3c-486c-bf8a-795d8f2e55f8" (UID: "067a3029-be3c-486c-bf8a-795d8f2e55f8"). InnerVolumeSpecName "kube-api-access-4qrlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.390386 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "067a3029-be3c-486c-bf8a-795d8f2e55f8" (UID: "067a3029-be3c-486c-bf8a-795d8f2e55f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.392457 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-config" (OuterVolumeSpecName: "config") pod "067a3029-be3c-486c-bf8a-795d8f2e55f8" (UID: "067a3029-be3c-486c-bf8a-795d8f2e55f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.398402 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "067a3029-be3c-486c-bf8a-795d8f2e55f8" (UID: "067a3029-be3c-486c-bf8a-795d8f2e55f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.407786 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "067a3029-be3c-486c-bf8a-795d8f2e55f8" (UID: "067a3029-be3c-486c-bf8a-795d8f2e55f8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.411316 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.411412 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qrlv\" (UniqueName: \"kubernetes.io/projected/067a3029-be3c-486c-bf8a-795d8f2e55f8-kube-api-access-4qrlv\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.411470 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.411520 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.411573 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067a3029-be3c-486c-bf8a-795d8f2e55f8-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.544544 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" event={"ID":"067a3029-be3c-486c-bf8a-795d8f2e55f8","Type":"ContainerDied","Data":"1c0e6ab26858f3a506b273d87fd744dd8b366669841ceefa377d7d000bcc8afa"} Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.545578 4965 scope.go:117] "RemoveContainer" containerID="696e15e409d7de6b543b64e45d2f2d57a461493565bd00c4432ed544fe3cdec6" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.544741 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b99c4f867-9xrlr" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.546622 4965 generic.go:334] "Generic (PLEG): container finished" podID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerID="6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb" exitCode=0 Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.546653 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" event={"ID":"bc935d6a-b651-4c29-9d70-5b9abc6c8580","Type":"ContainerDied","Data":"6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb"} Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.546672 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" event={"ID":"bc935d6a-b651-4c29-9d70-5b9abc6c8580","Type":"ContainerStarted","Data":"72a8e390831b36bdb794d23575d36f78b3b90de66e8047ced30e5d4659aac04e"} Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.696590 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b99c4f867-9xrlr"] Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.696635 4965 scope.go:117] "RemoveContainer" containerID="4b46c71a0c77b619168e9701e44bbb46ea366b0301c87902b89146b4954f9a74" Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.710744 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b99c4f867-9xrlr"] Nov 25 15:27:26 crc kubenswrapper[4965]: I1125 15:27:26.790869 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" path="/var/lib/kubelet/pods/067a3029-be3c-486c-bf8a-795d8f2e55f8/volumes" Nov 25 15:27:27 crc kubenswrapper[4965]: I1125 15:27:27.084988 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 25 15:27:27 crc kubenswrapper[4965]: I1125 15:27:27.164350 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:27 crc kubenswrapper[4965]: I1125 15:27:27.583192 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" event={"ID":"bc935d6a-b651-4c29-9d70-5b9abc6c8580","Type":"ContainerStarted","Data":"528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0"} Nov 25 15:27:27 crc kubenswrapper[4965]: I1125 15:27:27.583318 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="cinder-scheduler" containerID="cri-o://525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7" gracePeriod=30 Nov 25 15:27:27 crc kubenswrapper[4965]: I1125 15:27:27.583372 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="probe" containerID="cri-o://93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb" gracePeriod=30 Nov 25 15:27:27 crc kubenswrapper[4965]: I1125 15:27:27.619482 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" podStartSLOduration=2.619467004 podStartE2EDuration="2.619467004s" podCreationTimestamp="2025-11-25 15:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:27.617475969 +0000 UTC m=+1392.585069715" watchObservedRunningTime="2025-11-25 15:27:27.619467004 +0000 UTC m=+1392.587060750" Nov 25 15:27:28 crc kubenswrapper[4965]: I1125 15:27:28.590431 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.024145 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 25 15:27:29 crc kubenswrapper[4965]: E1125 
15:27:29.024868 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerName="dnsmasq-dns" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.024891 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerName="dnsmasq-dns" Nov 25 15:27:29 crc kubenswrapper[4965]: E1125 15:27:29.024915 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerName="init" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.024924 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerName="init" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.025199 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="067a3029-be3c-486c-bf8a-795d8f2e55f8" containerName="dnsmasq-dns" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.025979 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.028245 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.029133 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.029293 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bshc5" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.050888 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.166236 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nn2m\" (UniqueName: \"kubernetes.io/projected/edff109d-255f-4c31-a010-896ca2068559-kube-api-access-5nn2m\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.166303 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edff109d-255f-4c31-a010-896ca2068559-combined-ca-bundle\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.166323 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/edff109d-255f-4c31-a010-896ca2068559-openstack-config-secret\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.167277 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/edff109d-255f-4c31-a010-896ca2068559-openstack-config\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.269530 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edff109d-255f-4c31-a010-896ca2068559-combined-ca-bundle\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.269907 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/edff109d-255f-4c31-a010-896ca2068559-openstack-config-secret\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.269999 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/edff109d-255f-4c31-a010-896ca2068559-openstack-config\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.270071 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn2m\" (UniqueName: \"kubernetes.io/projected/edff109d-255f-4c31-a010-896ca2068559-kube-api-access-5nn2m\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.272868 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/edff109d-255f-4c31-a010-896ca2068559-openstack-config\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.277430 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edff109d-255f-4c31-a010-896ca2068559-combined-ca-bundle\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.281181 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/edff109d-255f-4c31-a010-896ca2068559-openstack-config-secret\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.303222 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn2m\" (UniqueName: \"kubernetes.io/projected/edff109d-255f-4c31-a010-896ca2068559-kube-api-access-5nn2m\") pod \"openstackclient\" (UID: \"edff109d-255f-4c31-a010-896ca2068559\") " pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.346631 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.566429 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.612780 4965 generic.go:334] "Generic (PLEG): container finished" podID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerID="93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb" exitCode=0 Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.612822 4965 generic.go:334] "Generic (PLEG): container finished" podID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerID="525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7" exitCode=0 Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.613627 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.614040 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65d9bb97-2095-4f82-81e9-2fb5fc578079","Type":"ContainerDied","Data":"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb"} Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.614065 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65d9bb97-2095-4f82-81e9-2fb5fc578079","Type":"ContainerDied","Data":"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7"} Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.614075 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65d9bb97-2095-4f82-81e9-2fb5fc578079","Type":"ContainerDied","Data":"37bcf8dd614b1bc2bd9e1e53ae696c7835e9296a156b04ef363a61d39cc66e6c"} Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.614092 4965 scope.go:117] "RemoveContainer" containerID="93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.676936 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data\") pod \"65d9bb97-2095-4f82-81e9-2fb5fc578079\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.677710 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data-custom\") pod \"65d9bb97-2095-4f82-81e9-2fb5fc578079\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.677740 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-scripts\") pod \"65d9bb97-2095-4f82-81e9-2fb5fc578079\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.678487 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-combined-ca-bundle\") pod \"65d9bb97-2095-4f82-81e9-2fb5fc578079\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.678532 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6tc\" (UniqueName: \"kubernetes.io/projected/65d9bb97-2095-4f82-81e9-2fb5fc578079-kube-api-access-4c6tc\") pod \"65d9bb97-2095-4f82-81e9-2fb5fc578079\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.678609 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d9bb97-2095-4f82-81e9-2fb5fc578079-etc-machine-id\") pod \"65d9bb97-2095-4f82-81e9-2fb5fc578079\" (UID: \"65d9bb97-2095-4f82-81e9-2fb5fc578079\") " Nov 25 15:27:29 crc 
kubenswrapper[4965]: I1125 15:27:29.683046 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-scripts" (OuterVolumeSpecName: "scripts") pod "65d9bb97-2095-4f82-81e9-2fb5fc578079" (UID: "65d9bb97-2095-4f82-81e9-2fb5fc578079"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.683365 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65d9bb97-2095-4f82-81e9-2fb5fc578079-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "65d9bb97-2095-4f82-81e9-2fb5fc578079" (UID: "65d9bb97-2095-4f82-81e9-2fb5fc578079"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.686371 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.686405 4965 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d9bb97-2095-4f82-81e9-2fb5fc578079-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.703498 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d9bb97-2095-4f82-81e9-2fb5fc578079-kube-api-access-4c6tc" (OuterVolumeSpecName: "kube-api-access-4c6tc") pod "65d9bb97-2095-4f82-81e9-2fb5fc578079" (UID: "65d9bb97-2095-4f82-81e9-2fb5fc578079"). InnerVolumeSpecName "kube-api-access-4c6tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.703580 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "65d9bb97-2095-4f82-81e9-2fb5fc578079" (UID: "65d9bb97-2095-4f82-81e9-2fb5fc578079"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.724157 4965 scope.go:117] "RemoveContainer" containerID="525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.787887 4965 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.788489 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c6tc\" (UniqueName: \"kubernetes.io/projected/65d9bb97-2095-4f82-81e9-2fb5fc578079-kube-api-access-4c6tc\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.846108 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65d9bb97-2095-4f82-81e9-2fb5fc578079" (UID: "65d9bb97-2095-4f82-81e9-2fb5fc578079"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.875498 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data" (OuterVolumeSpecName: "config-data") pod "65d9bb97-2095-4f82-81e9-2fb5fc578079" (UID: "65d9bb97-2095-4f82-81e9-2fb5fc578079"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.875796 4965 scope.go:117] "RemoveContainer" containerID="93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb" Nov 25 15:27:29 crc kubenswrapper[4965]: E1125 15:27:29.879391 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb\": container with ID starting with 93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb not found: ID does not exist" containerID="93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.879437 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb"} err="failed to get container status \"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb\": rpc error: code = NotFound desc = could not find container \"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb\": container with ID starting with 93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb not found: ID does not exist" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.879461 4965 scope.go:117] "RemoveContainer" containerID="525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7" Nov 25 15:27:29 crc kubenswrapper[4965]: E1125 15:27:29.880363 4965 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7\": container with ID starting with 525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7 not found: ID does not exist" containerID="525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.880455 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7"} err="failed to get container status \"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7\": rpc error: code = NotFound desc = could not find container \"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7\": container with ID starting with 525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7 not found: ID does not exist" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.880535 4965 scope.go:117] "RemoveContainer" containerID="93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.880950 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb"} err="failed to get container status \"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb\": rpc error: code = NotFound desc = could not find container \"93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb\": container with ID starting with 93b9c79b5093fa07dd8fda820274464810e6820eb84015718b68f046b17ab2bb not found: ID does not exist" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.881014 4965 scope.go:117] "RemoveContainer" containerID="525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.881863 4965 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7"} err="failed to get container status \"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7\": rpc error: code = NotFound desc = could not find container \"525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7\": container with ID starting with 525ca1eb9912ab8ecef3765e452cc84aa949b662b37436efcb93ea27610a49b7 not found: ID does not exist" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.885847 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.889927 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.889955 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d9bb97-2095-4f82-81e9-2fb5fc578079-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:29 crc kubenswrapper[4965]: I1125 15:27:29.970109 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.006888 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.038865 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:30 crc kubenswrapper[4965]: E1125 15:27:30.039545 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="cinder-scheduler" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.039570 4965 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="cinder-scheduler" Nov 25 15:27:30 crc kubenswrapper[4965]: E1125 15:27:30.039608 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="probe" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.039615 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="probe" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.040015 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="cinder-scheduler" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.040052 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" containerName="probe" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.041531 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.060081 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.082037 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 15:27:30 crc kubenswrapper[4965]: E1125 15:27:30.102787 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65d9bb97_2095_4f82_81e9_2fb5fc578079.slice\": RecentStats: unable to find data in memory cache]" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.114037 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvb59\" (UniqueName: \"kubernetes.io/projected/daa918e9-374b-4302-910d-496d7a0a746c-kube-api-access-kvb59\") pod 
\"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.114089 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-scripts\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.114256 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daa918e9-374b-4302-910d-496d7a0a746c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.114498 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.114576 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.114633 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-config-data\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " 
pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.216820 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.216875 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.216903 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-config-data\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.216946 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvb59\" (UniqueName: \"kubernetes.io/projected/daa918e9-374b-4302-910d-496d7a0a746c-kube-api-access-kvb59\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.216986 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-scripts\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0" Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.217045 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daa918e9-374b-4302-910d-496d7a0a746c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.217120 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daa918e9-374b-4302-910d-496d7a0a746c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.221882 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.224367 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-scripts\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.224554 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.225197 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa918e9-374b-4302-910d-496d7a0a746c-config-data\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.235423 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvb59\" (UniqueName: \"kubernetes.io/projected/daa918e9-374b-4302-910d-496d7a0a746c-kube-api-access-kvb59\") pod \"cinder-scheduler-0\" (UID: \"daa918e9-374b-4302-910d-496d7a0a746c\") " pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.378601 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.628915 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"edff109d-255f-4c31-a010-896ca2068559","Type":"ContainerStarted","Data":"fbea0e03c43d3b36b074713cb565fa6be47cae71a71a60a0b2efd7af2a44a1bb"}
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.798544 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d9bb97-2095-4f82-81e9-2fb5fc578079" path="/var/lib/kubelet/pods/65d9bb97-2095-4f82-81e9-2fb5fc578079/volumes"
Nov 25 15:27:30 crc kubenswrapper[4965]: I1125 15:27:30.982327 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 25 15:27:31 crc kubenswrapper[4965]: I1125 15:27:31.115048 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Nov 25 15:27:31 crc kubenswrapper[4965]: I1125 15:27:31.641782 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"daa918e9-374b-4302-910d-496d7a0a746c","Type":"ContainerStarted","Data":"85f542851866923bd33ba978cc70ad780888772be0b2d8c5754580207427d42e"}
Nov 25 15:27:32 crc kubenswrapper[4965]: I1125 15:27:32.651915 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"daa918e9-374b-4302-910d-496d7a0a746c","Type":"ContainerStarted","Data":"75a8b377128bc23c156a2753ebf2e8b720341a80370b2d1b3193672b3af081c3"}
Nov 25 15:27:32 crc kubenswrapper[4965]: I1125 15:27:32.652333 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"daa918e9-374b-4302-910d-496d7a0a746c","Type":"ContainerStarted","Data":"bdc158a801227f0aacae8758ad32a4c33b6372397fb218625b2d4e1ce1f8dd9f"}
Nov 25 15:27:32 crc kubenswrapper[4965]: I1125 15:27:32.672785 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.672765822 podStartE2EDuration="3.672765822s" podCreationTimestamp="2025-11-25 15:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:32.669030221 +0000 UTC m=+1397.636623957" watchObservedRunningTime="2025-11-25 15:27:32.672765822 +0000 UTC m=+1397.640359568"
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.379331 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.402215 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.479493 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b4f48597-hxm64"]
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.479802 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerName="dnsmasq-dns" containerID="cri-o://fd17eef67b789033d7496f9201f46cb5076081113eaf11d48f90a1174002a4be" gracePeriod=10
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.704434 4965 generic.go:334] "Generic (PLEG): container finished" podID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerID="fd17eef67b789033d7496f9201f46cb5076081113eaf11d48f90a1174002a4be" exitCode=0
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.704618 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" event={"ID":"e37fd35e-99e8-4837-87f9-97a5dd1664bd","Type":"ContainerDied","Data":"fd17eef67b789033d7496f9201f46cb5076081113eaf11d48f90a1174002a4be"}
Nov 25 15:27:35 crc kubenswrapper[4965]: I1125 15:27:35.967040 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b4f48597-hxm64"
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.035703 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-nb\") pod \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") "
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.035742 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56w9n\" (UniqueName: \"kubernetes.io/projected/e37fd35e-99e8-4837-87f9-97a5dd1664bd-kube-api-access-56w9n\") pod \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") "
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.035774 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-config\") pod \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") "
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.035812 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-dns-svc\") pod \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") "
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.035840 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-sb\") pod \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\" (UID: \"e37fd35e-99e8-4837-87f9-97a5dd1664bd\") "
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.061226 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37fd35e-99e8-4837-87f9-97a5dd1664bd-kube-api-access-56w9n" (OuterVolumeSpecName: "kube-api-access-56w9n") pod "e37fd35e-99e8-4837-87f9-97a5dd1664bd" (UID: "e37fd35e-99e8-4837-87f9-97a5dd1664bd"). InnerVolumeSpecName "kube-api-access-56w9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.122396 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-config" (OuterVolumeSpecName: "config") pod "e37fd35e-99e8-4837-87f9-97a5dd1664bd" (UID: "e37fd35e-99e8-4837-87f9-97a5dd1664bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.135232 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e37fd35e-99e8-4837-87f9-97a5dd1664bd" (UID: "e37fd35e-99e8-4837-87f9-97a5dd1664bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.140134 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56w9n\" (UniqueName: \"kubernetes.io/projected/e37fd35e-99e8-4837-87f9-97a5dd1664bd-kube-api-access-56w9n\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.140161 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-config\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.140173 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.143250 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e37fd35e-99e8-4837-87f9-97a5dd1664bd" (UID: "e37fd35e-99e8-4837-87f9-97a5dd1664bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.165620 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e37fd35e-99e8-4837-87f9-97a5dd1664bd" (UID: "e37fd35e-99e8-4837-87f9-97a5dd1664bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.241692 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.241720 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e37fd35e-99e8-4837-87f9-97a5dd1664bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.718388 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b4f48597-hxm64" event={"ID":"e37fd35e-99e8-4837-87f9-97a5dd1664bd","Type":"ContainerDied","Data":"f412cb553b3fac4d9424fa3604101a1a2ff63993ceffe351be17b3ffdaa62632"}
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.718440 4965 scope.go:117] "RemoveContainer" containerID="fd17eef67b789033d7496f9201f46cb5076081113eaf11d48f90a1174002a4be"
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.718464 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b4f48597-hxm64"
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.757097 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b4f48597-hxm64"]
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.764680 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66b4f48597-hxm64"]
Nov 25 15:27:36 crc kubenswrapper[4965]: I1125 15:27:36.781862 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" path="/var/lib/kubelet/pods/e37fd35e-99e8-4837-87f9-97a5dd1664bd/volumes"
Nov 25 15:27:40 crc kubenswrapper[4965]: I1125 15:27:40.602634 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.209461 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.210773 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="proxy-httpd" containerID="cri-o://a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3" gracePeriod=30
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.210794 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-notification-agent" containerID="cri-o://e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c" gracePeriod=30
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.210693 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-central-agent" containerID="cri-o://44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a" gracePeriod=30
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.211245 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="sg-core" containerID="cri-o://7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047" gracePeriod=30
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.245104 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.828694 4965 generic.go:334] "Generic (PLEG): container finished" podID="ec223010-25ed-49bf-a840-9b04638e8de4" containerID="a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3" exitCode=0
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.829061 4965 generic.go:334] "Generic (PLEG): container finished" podID="ec223010-25ed-49bf-a840-9b04638e8de4" containerID="7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047" exitCode=2
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.829076 4965 generic.go:334] "Generic (PLEG): container finished" podID="ec223010-25ed-49bf-a840-9b04638e8de4" containerID="44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a" exitCode=0
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.828763 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerDied","Data":"a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3"}
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.829117 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerDied","Data":"7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047"}
Nov 25 15:27:42 crc kubenswrapper[4965]: I1125 15:27:42.829137 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerDied","Data":"44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a"}
Nov 25 15:27:43 crc kubenswrapper[4965]: I1125 15:27:43.336847 4965 scope.go:117] "RemoveContainer" containerID="2fbe0f0996a30cc98011f9fa28e3b412f84809219b072dadcf3746d5415197d3"
Nov 25 15:27:43 crc kubenswrapper[4965]: I1125 15:27:43.592982 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.147:3000/\": dial tcp 10.217.0.147:3000: connect: connection refused"
Nov 25 15:27:43 crc kubenswrapper[4965]: I1125 15:27:43.842056 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"edff109d-255f-4c31-a010-896ca2068559","Type":"ContainerStarted","Data":"66e70b2f6ec90c8ebf964f08d9e35f169e4689124fb86062d85f82e319f08b00"}
Nov 25 15:27:43 crc kubenswrapper[4965]: I1125 15:27:43.861740 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.3673171530000001 podStartE2EDuration="14.861718137s" podCreationTimestamp="2025-11-25 15:27:29 +0000 UTC" firstStartedPulling="2025-11-25 15:27:29.881148554 +0000 UTC m=+1394.848742300" lastFinishedPulling="2025-11-25 15:27:43.375549518 +0000 UTC m=+1408.343143284" observedRunningTime="2025-11-25 15:27:43.857393559 +0000 UTC m=+1408.824987305" watchObservedRunningTime="2025-11-25 15:27:43.861718137 +0000 UTC m=+1408.829311883"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.370290 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.530526 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-log-httpd\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.530602 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-config-data\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.530681 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-combined-ca-bundle\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.530892 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-sg-core-conf-yaml\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.530952 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-run-httpd\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.531039 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxgtv\" (UniqueName: \"kubernetes.io/projected/ec223010-25ed-49bf-a840-9b04638e8de4-kube-api-access-kxgtv\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.531083 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-scripts\") pod \"ec223010-25ed-49bf-a840-9b04638e8de4\" (UID: \"ec223010-25ed-49bf-a840-9b04638e8de4\") "
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.531133 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.531386 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.531585 4965 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.531609 4965 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec223010-25ed-49bf-a840-9b04638e8de4-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.553364 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec223010-25ed-49bf-a840-9b04638e8de4-kube-api-access-kxgtv" (OuterVolumeSpecName: "kube-api-access-kxgtv") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "kube-api-access-kxgtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.553370 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-scripts" (OuterVolumeSpecName: "scripts") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.580034 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.631170 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-config-data" (OuterVolumeSpecName: "config-data") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.633171 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.633210 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.633221 4965 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.633232 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxgtv\" (UniqueName: \"kubernetes.io/projected/ec223010-25ed-49bf-a840-9b04638e8de4-kube-api-access-kxgtv\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.645206 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec223010-25ed-49bf-a840-9b04638e8de4" (UID: "ec223010-25ed-49bf-a840-9b04638e8de4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.735021 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec223010-25ed-49bf-a840-9b04638e8de4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.867182 4965 generic.go:334] "Generic (PLEG): container finished" podID="ec223010-25ed-49bf-a840-9b04638e8de4" containerID="e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c" exitCode=0
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.867221 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerDied","Data":"e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c"}
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.867283 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec223010-25ed-49bf-a840-9b04638e8de4","Type":"ContainerDied","Data":"efe6ed093ce42c4b83e4162b74322ba0d2db21415a744f1e43f69db5e881bf39"}
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.867300 4965 scope.go:117] "RemoveContainer" containerID="a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.867516 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.883078 4965 scope.go:117] "RemoveContainer" containerID="7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.896388 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.901874 4965 scope.go:117] "RemoveContainer" containerID="e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.905976 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925211 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 25 15:27:46 crc kubenswrapper[4965]: E1125 15:27:46.925541 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="sg-core"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925552 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="sg-core"
Nov 25 15:27:46 crc kubenswrapper[4965]: E1125 15:27:46.925563 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="proxy-httpd"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925568 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="proxy-httpd"
Nov 25 15:27:46 crc kubenswrapper[4965]: E1125 15:27:46.925582 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-central-agent"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925588 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-central-agent"
Nov 25 15:27:46 crc kubenswrapper[4965]: E1125 15:27:46.925601 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerName="init"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925606 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerName="init"
Nov 25 15:27:46 crc kubenswrapper[4965]: E1125 15:27:46.925615 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerName="dnsmasq-dns"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925621 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerName="dnsmasq-dns"
Nov 25 15:27:46 crc kubenswrapper[4965]: E1125 15:27:46.925635 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-notification-agent"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925641 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-notification-agent"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925778 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37fd35e-99e8-4837-87f9-97a5dd1664bd" containerName="dnsmasq-dns"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925797 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-central-agent"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925805 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="sg-core"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925816 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="proxy-httpd"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.925829 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" containerName="ceilometer-notification-agent"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.927200 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.932362 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.936395 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.950290 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 15:27:46 crc kubenswrapper[4965]: I1125 15:27:46.961456 4965 scope.go:117] "RemoveContainer" containerID="44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.002763 4965 scope.go:117] "RemoveContainer" containerID="a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3"
Nov 25 15:27:47 crc kubenswrapper[4965]: E1125 15:27:47.007114 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3\": container with ID starting with a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3 not found: ID does not exist" containerID="a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.007163 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3"} err="failed to get container status \"a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3\": rpc error: code = NotFound desc = could not find container \"a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3\": container with ID starting with a65165ed0235e02f702a42eaa5a540913e294741385a608d9278b6b53f767eb3 not found: ID does not exist"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.007217 4965 scope.go:117] "RemoveContainer" containerID="7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047"
Nov 25 15:27:47 crc kubenswrapper[4965]: E1125 15:27:47.007572 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047\": container with ID starting with 7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047 not found: ID does not exist" containerID="7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.007588 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047"} err="failed to get container status \"7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047\": rpc error: code = NotFound desc = could not find container \"7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047\": container with ID starting with 7ce8b0f24c0a720dfded814bee265d233989c673a3d08634ae83d7d94c48d047 not found: ID does not exist"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.007599 4965 scope.go:117] "RemoveContainer" containerID="e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c"
Nov 25 15:27:47 crc kubenswrapper[4965]: E1125 15:27:47.007812 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c\": container with ID starting with e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c not found: ID does not exist" containerID="e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.007830 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c"} err="failed to get container status \"e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c\": rpc error: code = NotFound desc = could not find container \"e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c\": container with ID starting with e4526a08785417b94704c36ebda5c4ab5c6984de108442f443561bc2de707b0c not found: ID does not exist"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.007841 4965 scope.go:117] "RemoveContainer" containerID="44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a"
Nov 25 15:27:47 crc kubenswrapper[4965]: E1125 15:27:47.008031 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a\": container with ID starting with 44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a not found: ID does not exist" containerID="44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.008046 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a"} err="failed to get container status \"44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a\": rpc error: code = NotFound desc = could not find container \"44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a\": container with ID starting with 44288468b0dc368a1ad7b36d8313e02e9c0361d7e29e0554e4cf179a63fd283a not found: ID does not exist"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041553 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-run-httpd\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041620 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-log-httpd\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041675 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-scripts\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041762 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041789 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-config-data\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041810 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.041867 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwdc\" (UniqueName: \"kubernetes.io/projected/08e30ac4-238c-422b-bc13-aaeb4190ac38-kube-api-access-mzwdc\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.142937 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-scripts\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.143267 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.143377 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-config-data\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0"
Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.143466 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.143594 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzwdc\" (UniqueName: \"kubernetes.io/projected/08e30ac4-238c-422b-bc13-aaeb4190ac38-kube-api-access-mzwdc\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.143760 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-run-httpd\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.143848 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-log-httpd\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.144327 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-run-httpd\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.144433 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-log-httpd\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.146697 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.147372 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-config-data\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.152194 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-scripts\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.152607 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.165315 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwdc\" (UniqueName: \"kubernetes.io/projected/08e30ac4-238c-422b-bc13-aaeb4190ac38-kube-api-access-mzwdc\") pod \"ceilometer-0\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.305224 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.835451 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:47 crc kubenswrapper[4965]: I1125 15:27:47.877216 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerStarted","Data":"f795c56bf927ebb642130969c1f508b17dd4540e427777dbe4aef62a30eced62"} Nov 25 15:27:48 crc kubenswrapper[4965]: I1125 15:27:48.782202 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec223010-25ed-49bf-a840-9b04638e8de4" path="/var/lib/kubelet/pods/ec223010-25ed-49bf-a840-9b04638e8de4/volumes" Nov 25 15:27:48 crc kubenswrapper[4965]: I1125 15:27:48.886803 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerStarted","Data":"bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712"} Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.802295 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cszg8"] Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.804209 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.829129 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cszg8"] Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.893895 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668762c3-71e3-42b7-974e-734b02cdbc1c-operator-scripts\") pod \"nova-api-db-create-cszg8\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.894380 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65b8d\" (UniqueName: \"kubernetes.io/projected/668762c3-71e3-42b7-974e-734b02cdbc1c-kube-api-access-65b8d\") pod \"nova-api-db-create-cszg8\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.905102 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mlzsx"] Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.906379 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.931018 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mlzsx"] Nov 25 15:27:49 crc kubenswrapper[4965]: I1125 15:27:49.961701 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerStarted","Data":"9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295"} Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:49.997123 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668762c3-71e3-42b7-974e-734b02cdbc1c-operator-scripts\") pod \"nova-api-db-create-cszg8\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:49.997218 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65b8d\" (UniqueName: \"kubernetes.io/projected/668762c3-71e3-42b7-974e-734b02cdbc1c-kube-api-access-65b8d\") pod \"nova-api-db-create-cszg8\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:49.997246 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6a5305f-d6dc-4bf3-b356-4d278731672d-operator-scripts\") pod \"nova-cell0-db-create-mlzsx\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:49.997269 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr8dw\" (UniqueName: 
\"kubernetes.io/projected/f6a5305f-d6dc-4bf3-b356-4d278731672d-kube-api-access-sr8dw\") pod \"nova-cell0-db-create-mlzsx\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:49.998258 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668762c3-71e3-42b7-974e-734b02cdbc1c-operator-scripts\") pod \"nova-api-db-create-cszg8\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.008797 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4f01-account-create-6rdbh"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.009951 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.011925 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.016190 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4f01-account-create-6rdbh"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.022470 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65b8d\" (UniqueName: \"kubernetes.io/projected/668762c3-71e3-42b7-974e-734b02cdbc1c-kube-api-access-65b8d\") pod \"nova-api-db-create-cszg8\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.089224 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-pjhn7"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.095991 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.102153 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dghj\" (UniqueName: \"kubernetes.io/projected/b2b31744-51a0-4e7e-8239-0ca59000796e-kube-api-access-8dghj\") pod \"nova-api-4f01-account-create-6rdbh\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.102296 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6a5305f-d6dc-4bf3-b356-4d278731672d-operator-scripts\") pod \"nova-cell0-db-create-mlzsx\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.102319 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr8dw\" (UniqueName: \"kubernetes.io/projected/f6a5305f-d6dc-4bf3-b356-4d278731672d-kube-api-access-sr8dw\") pod \"nova-cell0-db-create-mlzsx\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.102354 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b31744-51a0-4e7e-8239-0ca59000796e-operator-scripts\") pod \"nova-api-4f01-account-create-6rdbh\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.102979 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6a5305f-d6dc-4bf3-b356-4d278731672d-operator-scripts\") pod 
\"nova-cell0-db-create-mlzsx\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.103049 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pjhn7"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.137234 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr8dw\" (UniqueName: \"kubernetes.io/projected/f6a5305f-d6dc-4bf3-b356-4d278731672d-kube-api-access-sr8dw\") pod \"nova-cell0-db-create-mlzsx\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.169157 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.191351 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ea7f-account-create-m7smz"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.192384 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.198322 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.206008 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b31744-51a0-4e7e-8239-0ca59000796e-operator-scripts\") pod \"nova-api-4f01-account-create-6rdbh\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.206071 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-operator-scripts\") pod \"nova-cell1-db-create-pjhn7\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.206128 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqjjh\" (UniqueName: \"kubernetes.io/projected/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-kube-api-access-xqjjh\") pod \"nova-cell1-db-create-pjhn7\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.206174 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dghj\" (UniqueName: \"kubernetes.io/projected/b2b31744-51a0-4e7e-8239-0ca59000796e-kube-api-access-8dghj\") pod \"nova-api-4f01-account-create-6rdbh\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.207111 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b31744-51a0-4e7e-8239-0ca59000796e-operator-scripts\") pod \"nova-api-4f01-account-create-6rdbh\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.233236 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dghj\" (UniqueName: \"kubernetes.io/projected/b2b31744-51a0-4e7e-8239-0ca59000796e-kube-api-access-8dghj\") pod \"nova-api-4f01-account-create-6rdbh\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.234369 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea7f-account-create-m7smz"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.275926 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.308083 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-operator-scripts\") pod \"nova-cell1-db-create-pjhn7\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.308134 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqjjh\" (UniqueName: \"kubernetes.io/projected/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-kube-api-access-xqjjh\") pod \"nova-cell1-db-create-pjhn7\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.308156 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443303d8-fe0f-4af1-8ce8-0e99085d5e49-operator-scripts\") pod \"nova-cell0-ea7f-account-create-m7smz\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.308182 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxbs\" (UniqueName: \"kubernetes.io/projected/443303d8-fe0f-4af1-8ce8-0e99085d5e49-kube-api-access-jfxbs\") pod \"nova-cell0-ea7f-account-create-m7smz\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.309743 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-operator-scripts\") pod \"nova-cell1-db-create-pjhn7\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.328314 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqjjh\" (UniqueName: \"kubernetes.io/projected/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-kube-api-access-xqjjh\") pod \"nova-cell1-db-create-pjhn7\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.393473 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.402792 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.409399 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443303d8-fe0f-4af1-8ce8-0e99085d5e49-operator-scripts\") pod \"nova-cell0-ea7f-account-create-m7smz\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.409447 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxbs\" (UniqueName: \"kubernetes.io/projected/443303d8-fe0f-4af1-8ce8-0e99085d5e49-kube-api-access-jfxbs\") pod \"nova-cell0-ea7f-account-create-m7smz\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.410318 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443303d8-fe0f-4af1-8ce8-0e99085d5e49-operator-scripts\") pod \"nova-cell0-ea7f-account-create-m7smz\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.416501 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.450470 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxbs\" (UniqueName: \"kubernetes.io/projected/443303d8-fe0f-4af1-8ce8-0e99085d5e49-kube-api-access-jfxbs\") pod \"nova-cell0-ea7f-account-create-m7smz\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.458695 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f6d8-account-create-mfxnx"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.459944 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.464149 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.474233 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f6d8-account-create-mfxnx"] Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.600543 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.613951 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd5tt\" (UniqueName: \"kubernetes.io/projected/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-kube-api-access-fd5tt\") pod \"nova-cell1-f6d8-account-create-mfxnx\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.614052 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-operator-scripts\") pod \"nova-cell1-f6d8-account-create-mfxnx\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.719733 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd5tt\" (UniqueName: \"kubernetes.io/projected/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-kube-api-access-fd5tt\") pod \"nova-cell1-f6d8-account-create-mfxnx\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.720082 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-operator-scripts\") pod \"nova-cell1-f6d8-account-create-mfxnx\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.720886 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-operator-scripts\") pod \"nova-cell1-f6d8-account-create-mfxnx\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.745878 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd5tt\" (UniqueName: \"kubernetes.io/projected/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-kube-api-access-fd5tt\") pod \"nova-cell1-f6d8-account-create-mfxnx\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: E1125 15:27:50.838723 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfef716b_a6c5_4237_91b5_0078ce32f42d.slice/crio-5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfef716b_a6c5_4237_91b5_0078ce32f42d.slice/crio-conmon-5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926.scope\": RecentStats: unable to find data in memory cache]" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.861472 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:50 crc kubenswrapper[4965]: I1125 15:27:50.878592 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cszg8"] Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.064742 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4f01-account-create-6rdbh"] Nov 25 15:27:51 crc kubenswrapper[4965]: W1125 15:27:51.093746 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2b31744_51a0_4e7e_8239_0ca59000796e.slice/crio-7c629a42f0da5335852ec9cccde843f6ba0da13483282a327be8e25fa8b1d2cf WatchSource:0}: Error finding container 7c629a42f0da5335852ec9cccde843f6ba0da13483282a327be8e25fa8b1d2cf: Status 404 returned error can't find the container with id 7c629a42f0da5335852ec9cccde843f6ba0da13483282a327be8e25fa8b1d2cf Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.101715 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mlzsx"] Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.161426 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cszg8" event={"ID":"668762c3-71e3-42b7-974e-734b02cdbc1c","Type":"ContainerStarted","Data":"3049b70d416287a563833660cc427a7e9657e7314a8a3a141d0718aee2dc0f55"} Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.204907 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerStarted","Data":"d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e"} Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.261982 4965 generic.go:334] "Generic (PLEG): container finished" podID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerID="5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926" exitCode=137 Nov 25 15:27:51 crc 
kubenswrapper[4965]: I1125 15:27:51.262036 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfef716b-a6c5-4237-91b5-0078ce32f42d","Type":"ContainerDied","Data":"5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926"} Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.505993 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pjhn7"] Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.723314 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea7f-account-create-m7smz"] Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.797611 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875309 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfef716b-a6c5-4237-91b5-0078ce32f42d-logs\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875354 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875465 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-combined-ca-bundle\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875624 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhzjz\" 
(UniqueName: \"kubernetes.io/projected/bfef716b-a6c5-4237-91b5-0078ce32f42d-kube-api-access-rhzjz\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875655 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data-custom\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875716 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-scripts\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.875770 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfef716b-a6c5-4237-91b5-0078ce32f42d-etc-machine-id\") pod \"bfef716b-a6c5-4237-91b5-0078ce32f42d\" (UID: \"bfef716b-a6c5-4237-91b5-0078ce32f42d\") " Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.876000 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfef716b-a6c5-4237-91b5-0078ce32f42d-logs" (OuterVolumeSpecName: "logs") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.876120 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfef716b-a6c5-4237-91b5-0078ce32f42d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.876592 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfef716b-a6c5-4237-91b5-0078ce32f42d-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.876613 4965 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfef716b-a6c5-4237-91b5-0078ce32f42d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.905884 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.922912 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-scripts" (OuterVolumeSpecName: "scripts") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.942845 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfef716b-a6c5-4237-91b5-0078ce32f42d-kube-api-access-rhzjz" (OuterVolumeSpecName: "kube-api-access-rhzjz") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "kube-api-access-rhzjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.977669 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhzjz\" (UniqueName: \"kubernetes.io/projected/bfef716b-a6c5-4237-91b5-0078ce32f42d-kube-api-access-rhzjz\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.977702 4965 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:51 crc kubenswrapper[4965]: I1125 15:27:51.977711 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.094507 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f6d8-account-create-mfxnx"] Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.111959 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.151138 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data" (OuterVolumeSpecName: "config-data") pod "bfef716b-a6c5-4237-91b5-0078ce32f42d" (UID: "bfef716b-a6c5-4237-91b5-0078ce32f42d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:52 crc kubenswrapper[4965]: W1125 15:27:52.156257 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada44be1_c488_4ad3_bee0_fe9c0d7b84af.slice/crio-b777ec741177c73853941fc2b408e91d6126cdb32f78d195216941ec0409d92e WatchSource:0}: Error finding container b777ec741177c73853941fc2b408e91d6126cdb32f78d195216941ec0409d92e: Status 404 returned error can't find the container with id b777ec741177c73853941fc2b408e91d6126cdb32f78d195216941ec0409d92e Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.181000 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.181038 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfef716b-a6c5-4237-91b5-0078ce32f42d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.279826 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cszg8" event={"ID":"668762c3-71e3-42b7-974e-734b02cdbc1c","Type":"ContainerStarted","Data":"2b708fc65e9bad06225fe7eb94f5f0fd7bf4aab88576225fe591407a02fac35f"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.293232 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-pjhn7" event={"ID":"43fd6c78-af14-40ad-9fca-ce8a2d0370ba","Type":"ContainerStarted","Data":"e55b868f3f91ca9a8d6da8bb998e9fa66ea93128ac3fa1b050ba26a8a75a169f"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.293312 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pjhn7" event={"ID":"43fd6c78-af14-40ad-9fca-ce8a2d0370ba","Type":"ContainerStarted","Data":"ffbbdb75b749481cd56f542e90f42a5b639e9f5d39e23ff71d19040e5beda1ca"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.302900 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea7f-account-create-m7smz" event={"ID":"443303d8-fe0f-4af1-8ce8-0e99085d5e49","Type":"ContainerStarted","Data":"2882683b376024b3a00294d67797ac5fca63691b16ef26abf8edb6e8df7cd87e"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.302946 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea7f-account-create-m7smz" event={"ID":"443303d8-fe0f-4af1-8ce8-0e99085d5e49","Type":"ContainerStarted","Data":"3e50b86bc8134df6d43dae6d3d2f49795d93954df272baf66df8b180beaca890"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.311540 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-cszg8" podStartSLOduration=3.311504219 podStartE2EDuration="3.311504219s" podCreationTimestamp="2025-11-25 15:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:52.294463006 +0000 UTC m=+1417.262056752" watchObservedRunningTime="2025-11-25 15:27:52.311504219 +0000 UTC m=+1417.279097965" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.316862 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-pjhn7" podStartSLOduration=2.316842754 podStartE2EDuration="2.316842754s" podCreationTimestamp="2025-11-25 
15:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:52.309943997 +0000 UTC m=+1417.277537743" watchObservedRunningTime="2025-11-25 15:27:52.316842754 +0000 UTC m=+1417.284436500" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.320416 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" event={"ID":"ada44be1-c488-4ad3-bee0-fe9c0d7b84af","Type":"ContainerStarted","Data":"b777ec741177c73853941fc2b408e91d6126cdb32f78d195216941ec0409d92e"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.335801 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfef716b-a6c5-4237-91b5-0078ce32f42d","Type":"ContainerDied","Data":"de561b464461acfd0dcad1947da3ff12fd7f99a08649be49b08b0f6ee382bf53"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.335864 4965 scope.go:117] "RemoveContainer" containerID="5fcc3cacc4c27424c7b4880613b33f76e42f1e22acada5c73c1b4a1e28b9a926" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.336044 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.339469 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mlzsx" event={"ID":"f6a5305f-d6dc-4bf3-b356-4d278731672d","Type":"ContainerStarted","Data":"fe86c3c0a2e0e7d0fa27ce05e8e36e30f97266e6cdfb280f738d4b22436f4287"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.339511 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mlzsx" event={"ID":"f6a5305f-d6dc-4bf3-b356-4d278731672d","Type":"ContainerStarted","Data":"8dd6a3d8bf7129173fb85bc745080fab40fa6d2d663f47e0b5f12ce268a30854"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.357142 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f01-account-create-6rdbh" event={"ID":"b2b31744-51a0-4e7e-8239-0ca59000796e","Type":"ContainerStarted","Data":"7c41b069cd66df87c972997bcbd7e6b850c5701c808a405b9762dac3f41e4caa"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.357200 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f01-account-create-6rdbh" event={"ID":"b2b31744-51a0-4e7e-8239-0ca59000796e","Type":"ContainerStarted","Data":"7c629a42f0da5335852ec9cccde843f6ba0da13483282a327be8e25fa8b1d2cf"} Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.392405 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-ea7f-account-create-m7smz" podStartSLOduration=2.392384578 podStartE2EDuration="2.392384578s" podCreationTimestamp="2025-11-25 15:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:52.373232327 +0000 UTC m=+1417.340826073" watchObservedRunningTime="2025-11-25 15:27:52.392384578 +0000 UTC m=+1417.359978324" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.408390 4965 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-mlzsx" podStartSLOduration=3.408346443 podStartE2EDuration="3.408346443s" podCreationTimestamp="2025-11-25 15:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:52.404899679 +0000 UTC m=+1417.372493415" watchObservedRunningTime="2025-11-25 15:27:52.408346443 +0000 UTC m=+1417.375940189" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.434132 4965 scope.go:117] "RemoveContainer" containerID="fe7609d42fea400dcd0e23317a4192bac803d9f9ea080c81804eb10081e4b1a8" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.461023 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.472248 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.498059 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:52 crc kubenswrapper[4965]: E1125 15:27:52.498779 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api-log" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.498864 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api-log" Nov 25 15:27:52 crc kubenswrapper[4965]: E1125 15:27:52.498950 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.499049 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.499332 4965 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api-log" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.503058 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" containerName="cinder-api" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.504375 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.513506 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.513734 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.513879 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.515489 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.586598 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59dzz\" (UniqueName: \"kubernetes.io/projected/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-kube-api-access-59dzz\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.586932 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-scripts\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587046 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587126 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587233 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-config-data\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587310 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587411 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587513 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-logs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.587609 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.689160 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59dzz\" (UniqueName: \"kubernetes.io/projected/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-kube-api-access-59dzz\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.689757 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-scripts\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.690497 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.690593 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 
crc kubenswrapper[4965]: I1125 15:27:52.690737 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-config-data\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.690893 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.691072 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.691160 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-logs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.691241 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.691369 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.691630 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-logs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.694146 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.694659 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-scripts\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.694730 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-config-data\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.695762 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.697287 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.708294 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.713649 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59dzz\" (UniqueName: \"kubernetes.io/projected/7a7b2938-42cd-4dee-b5fe-2c85b4bea92f-kube-api-access-59dzz\") pod \"cinder-api-0\" (UID: \"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f\") " pod="openstack/cinder-api-0" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.781773 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfef716b-a6c5-4237-91b5-0078ce32f42d" path="/var/lib/kubelet/pods/bfef716b-a6c5-4237-91b5-0078ce32f42d/volumes" Nov 25 15:27:52 crc kubenswrapper[4965]: I1125 15:27:52.836253 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.356344 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.370772 4965 generic.go:334] "Generic (PLEG): container finished" podID="f6a5305f-d6dc-4bf3-b356-4d278731672d" containerID="fe86c3c0a2e0e7d0fa27ce05e8e36e30f97266e6cdfb280f738d4b22436f4287" exitCode=0 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.370881 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mlzsx" event={"ID":"f6a5305f-d6dc-4bf3-b356-4d278731672d","Type":"ContainerDied","Data":"fe86c3c0a2e0e7d0fa27ce05e8e36e30f97266e6cdfb280f738d4b22436f4287"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.372290 4965 generic.go:334] "Generic (PLEG): container finished" podID="b2b31744-51a0-4e7e-8239-0ca59000796e" containerID="7c41b069cd66df87c972997bcbd7e6b850c5701c808a405b9762dac3f41e4caa" exitCode=0 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.372351 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f01-account-create-6rdbh" event={"ID":"b2b31744-51a0-4e7e-8239-0ca59000796e","Type":"ContainerDied","Data":"7c41b069cd66df87c972997bcbd7e6b850c5701c808a405b9762dac3f41e4caa"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.374672 4965 generic.go:334] "Generic (PLEG): container finished" podID="668762c3-71e3-42b7-974e-734b02cdbc1c" containerID="2b708fc65e9bad06225fe7eb94f5f0fd7bf4aab88576225fe591407a02fac35f" exitCode=0 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.374748 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cszg8" event={"ID":"668762c3-71e3-42b7-974e-734b02cdbc1c","Type":"ContainerDied","Data":"2b708fc65e9bad06225fe7eb94f5f0fd7bf4aab88576225fe591407a02fac35f"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.389385 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerStarted","Data":"843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.389519 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-central-agent" containerID="cri-o://bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712" gracePeriod=30 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.389562 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.389617 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="sg-core" containerID="cri-o://d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e" gracePeriod=30 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.389654 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-notification-agent" containerID="cri-o://9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295" gracePeriod=30 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.389697 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="proxy-httpd" containerID="cri-o://843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14" gracePeriod=30 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.401205 4965 generic.go:334] "Generic (PLEG): container finished" podID="43fd6c78-af14-40ad-9fca-ce8a2d0370ba" 
containerID="e55b868f3f91ca9a8d6da8bb998e9fa66ea93128ac3fa1b050ba26a8a75a169f" exitCode=0 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.401288 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pjhn7" event={"ID":"43fd6c78-af14-40ad-9fca-ce8a2d0370ba","Type":"ContainerDied","Data":"e55b868f3f91ca9a8d6da8bb998e9fa66ea93128ac3fa1b050ba26a8a75a169f"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.411855 4965 generic.go:334] "Generic (PLEG): container finished" podID="443303d8-fe0f-4af1-8ce8-0e99085d5e49" containerID="2882683b376024b3a00294d67797ac5fca63691b16ef26abf8edb6e8df7cd87e" exitCode=0 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.411909 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea7f-account-create-m7smz" event={"ID":"443303d8-fe0f-4af1-8ce8-0e99085d5e49","Type":"ContainerDied","Data":"2882683b376024b3a00294d67797ac5fca63691b16ef26abf8edb6e8df7cd87e"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.424193 4965 generic.go:334] "Generic (PLEG): container finished" podID="ada44be1-c488-4ad3-bee0-fe9c0d7b84af" containerID="5947467abd4d280588c55bd5f26d68615c4ba3e0d5be62847489a19610b9b689" exitCode=0 Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.424241 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" event={"ID":"ada44be1-c488-4ad3-bee0-fe9c0d7b84af","Type":"ContainerDied","Data":"5947467abd4d280588c55bd5f26d68615c4ba3e0d5be62847489a19610b9b689"} Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.508501 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.574313703 podStartE2EDuration="7.508473663s" podCreationTimestamp="2025-11-25 15:27:46 +0000 UTC" firstStartedPulling="2025-11-25 15:27:47.846516046 +0000 UTC m=+1412.814109792" lastFinishedPulling="2025-11-25 15:27:52.780676006 +0000 UTC m=+1417.748269752" 
observedRunningTime="2025-11-25 15:27:53.498529812 +0000 UTC m=+1418.466123558" watchObservedRunningTime="2025-11-25 15:27:53.508473663 +0000 UTC m=+1418.476067409" Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.845365 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.922516 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dghj\" (UniqueName: \"kubernetes.io/projected/b2b31744-51a0-4e7e-8239-0ca59000796e-kube-api-access-8dghj\") pod \"b2b31744-51a0-4e7e-8239-0ca59000796e\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.922580 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b31744-51a0-4e7e-8239-0ca59000796e-operator-scripts\") pod \"b2b31744-51a0-4e7e-8239-0ca59000796e\" (UID: \"b2b31744-51a0-4e7e-8239-0ca59000796e\") " Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.924057 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2b31744-51a0-4e7e-8239-0ca59000796e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2b31744-51a0-4e7e-8239-0ca59000796e" (UID: "b2b31744-51a0-4e7e-8239-0ca59000796e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:53 crc kubenswrapper[4965]: I1125 15:27:53.931137 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b31744-51a0-4e7e-8239-0ca59000796e-kube-api-access-8dghj" (OuterVolumeSpecName: "kube-api-access-8dghj") pod "b2b31744-51a0-4e7e-8239-0ca59000796e" (UID: "b2b31744-51a0-4e7e-8239-0ca59000796e"). InnerVolumeSpecName "kube-api-access-8dghj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.025240 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dghj\" (UniqueName: \"kubernetes.io/projected/b2b31744-51a0-4e7e-8239-0ca59000796e-kube-api-access-8dghj\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.025271 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b31744-51a0-4e7e-8239-0ca59000796e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.440217 4965 generic.go:334] "Generic (PLEG): container finished" podID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerID="843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14" exitCode=0 Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.440546 4965 generic.go:334] "Generic (PLEG): container finished" podID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerID="d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e" exitCode=2 Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.440556 4965 generic.go:334] "Generic (PLEG): container finished" podID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerID="9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295" exitCode=0 Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.440309 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerDied","Data":"843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14"} Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.441154 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerDied","Data":"d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e"} Nov 25 15:27:54 crc 
kubenswrapper[4965]: I1125 15:27:54.441169 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerDied","Data":"9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295"} Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.442699 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f","Type":"ContainerStarted","Data":"5f23d313c7d32b95d36d1b10a1fda61b3cb002706f09e99dd168bf817c1a86e1"} Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.442721 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f","Type":"ContainerStarted","Data":"c1f7ae67907e6575eb2f548566914ba2e896320fe36716db1322bd4428bb221c"} Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.444850 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4f01-account-create-6rdbh" Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.450121 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f01-account-create-6rdbh" event={"ID":"b2b31744-51a0-4e7e-8239-0ca59000796e","Type":"ContainerDied","Data":"7c629a42f0da5335852ec9cccde843f6ba0da13483282a327be8e25fa8b1d2cf"} Nov 25 15:27:54 crc kubenswrapper[4965]: I1125 15:27:54.450181 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c629a42f0da5335852ec9cccde843f6ba0da13483282a327be8e25fa8b1d2cf" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.087755 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.148794 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd5tt\" (UniqueName: \"kubernetes.io/projected/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-kube-api-access-fd5tt\") pod \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.149322 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-operator-scripts\") pod \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\" (UID: \"ada44be1-c488-4ad3-bee0-fe9c0d7b84af\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.150527 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ada44be1-c488-4ad3-bee0-fe9c0d7b84af" (UID: "ada44be1-c488-4ad3-bee0-fe9c0d7b84af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.155042 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-kube-api-access-fd5tt" (OuterVolumeSpecName: "kube-api-access-fd5tt") pod "ada44be1-c488-4ad3-bee0-fe9c0d7b84af" (UID: "ada44be1-c488-4ad3-bee0-fe9c0d7b84af"). InnerVolumeSpecName "kube-api-access-fd5tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.251907 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.251932 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd5tt\" (UniqueName: \"kubernetes.io/projected/ada44be1-c488-4ad3-bee0-fe9c0d7b84af-kube-api-access-fd5tt\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.289537 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.352616 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443303d8-fe0f-4af1-8ce8-0e99085d5e49-operator-scripts\") pod \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.352802 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxbs\" (UniqueName: \"kubernetes.io/projected/443303d8-fe0f-4af1-8ce8-0e99085d5e49-kube-api-access-jfxbs\") pod \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\" (UID: \"443303d8-fe0f-4af1-8ce8-0e99085d5e49\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.353413 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443303d8-fe0f-4af1-8ce8-0e99085d5e49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "443303d8-fe0f-4af1-8ce8-0e99085d5e49" (UID: "443303d8-fe0f-4af1-8ce8-0e99085d5e49"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.359369 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443303d8-fe0f-4af1-8ce8-0e99085d5e49-kube-api-access-jfxbs" (OuterVolumeSpecName: "kube-api-access-jfxbs") pod "443303d8-fe0f-4af1-8ce8-0e99085d5e49" (UID: "443303d8-fe0f-4af1-8ce8-0e99085d5e49"). InnerVolumeSpecName "kube-api-access-jfxbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.454608 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/443303d8-fe0f-4af1-8ce8-0e99085d5e49-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.454641 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxbs\" (UniqueName: \"kubernetes.io/projected/443303d8-fe0f-4af1-8ce8-0e99085d5e49-kube-api-access-jfxbs\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.458376 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cszg8" event={"ID":"668762c3-71e3-42b7-974e-734b02cdbc1c","Type":"ContainerDied","Data":"3049b70d416287a563833660cc427a7e9657e7314a8a3a141d0718aee2dc0f55"} Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.458417 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3049b70d416287a563833660cc427a7e9657e7314a8a3a141d0718aee2dc0f55" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.460131 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pjhn7" event={"ID":"43fd6c78-af14-40ad-9fca-ce8a2d0370ba","Type":"ContainerDied","Data":"ffbbdb75b749481cd56f542e90f42a5b639e9f5d39e23ff71d19040e5beda1ca"} Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.460165 4965 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffbbdb75b749481cd56f542e90f42a5b639e9f5d39e23ff71d19040e5beda1ca" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.461871 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea7f-account-create-m7smz" event={"ID":"443303d8-fe0f-4af1-8ce8-0e99085d5e49","Type":"ContainerDied","Data":"3e50b86bc8134df6d43dae6d3d2f49795d93954df272baf66df8b180beaca890"} Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.461908 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e50b86bc8134df6d43dae6d3d2f49795d93954df272baf66df8b180beaca890" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.461982 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea7f-account-create-m7smz" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.466956 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.466953 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f6d8-account-create-mfxnx" event={"ID":"ada44be1-c488-4ad3-bee0-fe9c0d7b84af","Type":"ContainerDied","Data":"b777ec741177c73853941fc2b408e91d6126cdb32f78d195216941ec0409d92e"} Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.467044 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b777ec741177c73853941fc2b408e91d6126cdb32f78d195216941ec0409d92e" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.473645 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.475178 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a7b2938-42cd-4dee-b5fe-2c85b4bea92f","Type":"ContainerStarted","Data":"3465e3c434caf72a6009bbb85617162441ab02ec86589b7f0c5a633f7f0b06d9"} Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.476211 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.484055 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mlzsx" event={"ID":"f6a5305f-d6dc-4bf3-b356-4d278731672d","Type":"ContainerDied","Data":"8dd6a3d8bf7129173fb85bc745080fab40fa6d2d663f47e0b5f12ce268a30854"} Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.484109 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd6a3d8bf7129173fb85bc745080fab40fa6d2d663f47e0b5f12ce268a30854" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.484350 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mlzsx" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.485520 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.501813 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.557424 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65b8d\" (UniqueName: \"kubernetes.io/projected/668762c3-71e3-42b7-974e-734b02cdbc1c-kube-api-access-65b8d\") pod \"668762c3-71e3-42b7-974e-734b02cdbc1c\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.557702 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6a5305f-d6dc-4bf3-b356-4d278731672d-operator-scripts\") pod \"f6a5305f-d6dc-4bf3-b356-4d278731672d\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.557897 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668762c3-71e3-42b7-974e-734b02cdbc1c-operator-scripts\") pod \"668762c3-71e3-42b7-974e-734b02cdbc1c\" (UID: \"668762c3-71e3-42b7-974e-734b02cdbc1c\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.557994 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqjjh\" (UniqueName: \"kubernetes.io/projected/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-kube-api-access-xqjjh\") pod \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.558101 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr8dw\" (UniqueName: \"kubernetes.io/projected/f6a5305f-d6dc-4bf3-b356-4d278731672d-kube-api-access-sr8dw\") pod \"f6a5305f-d6dc-4bf3-b356-4d278731672d\" (UID: \"f6a5305f-d6dc-4bf3-b356-4d278731672d\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.558202 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-operator-scripts\") pod \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\" (UID: \"43fd6c78-af14-40ad-9fca-ce8a2d0370ba\") " Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.558447 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a5305f-d6dc-4bf3-b356-4d278731672d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6a5305f-d6dc-4bf3-b356-4d278731672d" (UID: "f6a5305f-d6dc-4bf3-b356-4d278731672d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.558691 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6a5305f-d6dc-4bf3-b356-4d278731672d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.559798 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/668762c3-71e3-42b7-974e-734b02cdbc1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "668762c3-71e3-42b7-974e-734b02cdbc1c" (UID: "668762c3-71e3-42b7-974e-734b02cdbc1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.561386 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43fd6c78-af14-40ad-9fca-ce8a2d0370ba" (UID: "43fd6c78-af14-40ad-9fca-ce8a2d0370ba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.563112 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.563095874 podStartE2EDuration="3.563095874s" podCreationTimestamp="2025-11-25 15:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:27:55.550910493 +0000 UTC m=+1420.518504239" watchObservedRunningTime="2025-11-25 15:27:55.563095874 +0000 UTC m=+1420.530689620" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.571546 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a5305f-d6dc-4bf3-b356-4d278731672d-kube-api-access-sr8dw" (OuterVolumeSpecName: "kube-api-access-sr8dw") pod "f6a5305f-d6dc-4bf3-b356-4d278731672d" (UID: "f6a5305f-d6dc-4bf3-b356-4d278731672d"). InnerVolumeSpecName "kube-api-access-sr8dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.571652 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-kube-api-access-xqjjh" (OuterVolumeSpecName: "kube-api-access-xqjjh") pod "43fd6c78-af14-40ad-9fca-ce8a2d0370ba" (UID: "43fd6c78-af14-40ad-9fca-ce8a2d0370ba"). InnerVolumeSpecName "kube-api-access-xqjjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.579168 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668762c3-71e3-42b7-974e-734b02cdbc1c-kube-api-access-65b8d" (OuterVolumeSpecName: "kube-api-access-65b8d") pod "668762c3-71e3-42b7-974e-734b02cdbc1c" (UID: "668762c3-71e3-42b7-974e-734b02cdbc1c"). InnerVolumeSpecName "kube-api-access-65b8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.660697 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/668762c3-71e3-42b7-974e-734b02cdbc1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.660761 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqjjh\" (UniqueName: \"kubernetes.io/projected/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-kube-api-access-xqjjh\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.660774 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr8dw\" (UniqueName: \"kubernetes.io/projected/f6a5305f-d6dc-4bf3-b356-4d278731672d-kube-api-access-sr8dw\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.660784 4965 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fd6c78-af14-40ad-9fca-ce8a2d0370ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:55 crc kubenswrapper[4965]: I1125 15:27:55.660794 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65b8d\" (UniqueName: \"kubernetes.io/projected/668762c3-71e3-42b7-974e-734b02cdbc1c-kube-api-access-65b8d\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:56 crc kubenswrapper[4965]: I1125 15:27:56.491868 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pjhn7" Nov 25 15:27:56 crc kubenswrapper[4965]: I1125 15:27:56.494538 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cszg8" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.496240 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.501560 4965 generic.go:334] "Generic (PLEG): container finished" podID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerID="bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712" exitCode=0 Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.502524 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.502701 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerDied","Data":"bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712"} Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.502727 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08e30ac4-238c-422b-bc13-aaeb4190ac38","Type":"ContainerDied","Data":"f795c56bf927ebb642130969c1f508b17dd4540e427777dbe4aef62a30eced62"} Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.502743 4965 scope.go:117] "RemoveContainer" containerID="843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.539520 4965 scope.go:117] "RemoveContainer" containerID="d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.569985 4965 scope.go:117] "RemoveContainer" containerID="9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.595798 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-scripts\") pod \"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 
15:27:57.595892 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-run-httpd\") pod \"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.595919 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-config-data\") pod \"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.595955 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzwdc\" (UniqueName: \"kubernetes.io/projected/08e30ac4-238c-422b-bc13-aaeb4190ac38-kube-api-access-mzwdc\") pod \"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.596028 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-log-httpd\") pod \"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.596072 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-sg-core-conf-yaml\") pod \"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.596167 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-combined-ca-bundle\") pod 
\"08e30ac4-238c-422b-bc13-aaeb4190ac38\" (UID: \"08e30ac4-238c-422b-bc13-aaeb4190ac38\") " Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.597948 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.599403 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.625597 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-scripts" (OuterVolumeSpecName: "scripts") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.633304 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e30ac4-238c-422b-bc13-aaeb4190ac38-kube-api-access-mzwdc" (OuterVolumeSpecName: "kube-api-access-mzwdc") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "kube-api-access-mzwdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.636302 4965 scope.go:117] "RemoveContainer" containerID="bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.668261 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.699501 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.699562 4965 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.699575 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzwdc\" (UniqueName: \"kubernetes.io/projected/08e30ac4-238c-422b-bc13-aaeb4190ac38-kube-api-access-mzwdc\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.699588 4965 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08e30ac4-238c-422b-bc13-aaeb4190ac38-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.699596 4965 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" 
Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.703546 4965 scope.go:117] "RemoveContainer" containerID="843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.704556 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14\": container with ID starting with 843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14 not found: ID does not exist" containerID="843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.704597 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14"} err="failed to get container status \"843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14\": rpc error: code = NotFound desc = could not find container \"843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14\": container with ID starting with 843d9dffa3559868b441f345dde13822c13df0844af1cdb312b72c6aaac96e14 not found: ID does not exist" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.704640 4965 scope.go:117] "RemoveContainer" containerID="d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.706242 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e\": container with ID starting with d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e not found: ID does not exist" containerID="d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.706286 4965 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e"} err="failed to get container status \"d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e\": rpc error: code = NotFound desc = could not find container \"d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e\": container with ID starting with d678bebf63bbfc0f00285bdc05880da6bb69660a1dfe68532c56eee8249dc26e not found: ID does not exist" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.706312 4965 scope.go:117] "RemoveContainer" containerID="9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.706722 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295\": container with ID starting with 9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295 not found: ID does not exist" containerID="9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.706747 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295"} err="failed to get container status \"9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295\": rpc error: code = NotFound desc = could not find container \"9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295\": container with ID starting with 9661a5a433bf3d8555b1e653430aaaa3b1fbb2e7f62d7cc31d70df9a7fc47295 not found: ID does not exist" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.706767 4965 scope.go:117] "RemoveContainer" containerID="bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.707080 4965 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712\": container with ID starting with bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712 not found: ID does not exist" containerID="bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.707101 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712"} err="failed to get container status \"bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712\": rpc error: code = NotFound desc = could not find container \"bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712\": container with ID starting with bedf7013ef7af0a5b571fa83e62a77fba75ce1797235ea402674ab38abe8b712 not found: ID does not exist" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.711303 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-config-data" (OuterVolumeSpecName: "config-data") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.735885 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08e30ac4-238c-422b-bc13-aaeb4190ac38" (UID: "08e30ac4-238c-422b-bc13-aaeb4190ac38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.801217 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.801467 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e30ac4-238c-422b-bc13-aaeb4190ac38-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.860797 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.873251 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.882919 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883344 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-central-agent" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883368 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-central-agent" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883387 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fd6c78-af14-40ad-9fca-ce8a2d0370ba" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883394 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fd6c78-af14-40ad-9fca-ce8a2d0370ba" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883406 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="sg-core" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883413 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="sg-core" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883428 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443303d8-fe0f-4af1-8ce8-0e99085d5e49" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883435 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="443303d8-fe0f-4af1-8ce8-0e99085d5e49" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883449 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668762c3-71e3-42b7-974e-734b02cdbc1c" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883456 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="668762c3-71e3-42b7-974e-734b02cdbc1c" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883499 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-notification-agent" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883507 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-notification-agent" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883515 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada44be1-c488-4ad3-bee0-fe9c0d7b84af" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883522 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada44be1-c488-4ad3-bee0-fe9c0d7b84af" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883534 4965 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a5305f-d6dc-4bf3-b356-4d278731672d" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883542 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a5305f-d6dc-4bf3-b356-4d278731672d" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883566 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="proxy-httpd" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883575 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="proxy-httpd" Nov 25 15:27:57 crc kubenswrapper[4965]: E1125 15:27:57.883597 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b31744-51a0-4e7e-8239-0ca59000796e" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883604 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b31744-51a0-4e7e-8239-0ca59000796e" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883804 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="proxy-httpd" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883816 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-central-agent" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883833 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada44be1-c488-4ad3-bee0-fe9c0d7b84af" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883843 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b31744-51a0-4e7e-8239-0ca59000796e" containerName="mariadb-account-create" Nov 25 15:27:57 crc 
kubenswrapper[4965]: I1125 15:27:57.883854 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fd6c78-af14-40ad-9fca-ce8a2d0370ba" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883870 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="668762c3-71e3-42b7-974e-734b02cdbc1c" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883884 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="sg-core" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883893 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a5305f-d6dc-4bf3-b356-4d278731672d" containerName="mariadb-database-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883906 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" containerName="ceilometer-notification-agent" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.883919 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="443303d8-fe0f-4af1-8ce8-0e99085d5e49" containerName="mariadb-account-create" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.885508 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.892399 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.892571 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 15:27:57 crc kubenswrapper[4965]: I1125 15:27:57.897013 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.005551 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.005912 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvrs\" (UniqueName: \"kubernetes.io/projected/6af98be8-c639-4fe0-9efe-1caf076e412d-kube-api-access-fdvrs\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.005948 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-config-data\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.006012 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-scripts\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " 
pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.006068 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-run-httpd\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.006108 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-log-httpd\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.006155 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108210 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-run-httpd\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108266 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-log-httpd\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108323 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108373 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108407 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvrs\" (UniqueName: \"kubernetes.io/projected/6af98be8-c639-4fe0-9efe-1caf076e412d-kube-api-access-fdvrs\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108427 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-config-data\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.108466 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-scripts\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.109674 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-run-httpd\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc 
kubenswrapper[4965]: I1125 15:27:58.109672 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-log-httpd\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.112840 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-config-data\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.113052 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-scripts\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.113168 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.113437 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.128636 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvrs\" (UniqueName: \"kubernetes.io/projected/6af98be8-c639-4fe0-9efe-1caf076e412d-kube-api-access-fdvrs\") pod \"ceilometer-0\" (UID: 
\"6af98be8-c639-4fe0-9efe-1caf076e412d\") " pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.207383 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.272827 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.705545 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:27:58 crc kubenswrapper[4965]: I1125 15:27:58.781728 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e30ac4-238c-422b-bc13-aaeb4190ac38" path="/var/lib/kubelet/pods/08e30ac4-238c-422b-bc13-aaeb4190ac38/volumes" Nov 25 15:27:59 crc kubenswrapper[4965]: I1125 15:27:59.527788 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerStarted","Data":"968bd9e0aba8d89f1532635e1e52f06eb72e09a2c8a6c16829bb8783f68a6a8b"} Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.477345 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hkzn5"] Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.480095 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.488406 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vz49h" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.488432 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.488601 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.522379 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hkzn5"] Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.572847 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerStarted","Data":"68a4b5db79328b3786fad0ce6fe37af683b136ff4ba745625ce5fe0cf95698d5"} Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.577023 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-config-data\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.577104 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxkl4\" (UniqueName: \"kubernetes.io/projected/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-kube-api-access-fxkl4\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.577169 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-scripts\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.578332 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.679863 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.680124 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-config-data\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.680227 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxkl4\" (UniqueName: \"kubernetes.io/projected/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-kube-api-access-fxkl4\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc 
kubenswrapper[4965]: I1125 15:28:00.680326 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-scripts\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.686989 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-config-data\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.687422 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-scripts\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.699592 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.704500 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxkl4\" (UniqueName: \"kubernetes.io/projected/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-kube-api-access-fxkl4\") pod \"nova-cell0-conductor-db-sync-hkzn5\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:00 crc kubenswrapper[4965]: I1125 15:28:00.863909 4965 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:28:01 crc kubenswrapper[4965]: I1125 15:28:01.358810 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hkzn5"] Nov 25 15:28:01 crc kubenswrapper[4965]: I1125 15:28:01.581417 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" event={"ID":"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069","Type":"ContainerStarted","Data":"a22f5a9dccd39d66dc874c5cf4ecb7576190f4aa5b5637d20a7ee9929ebb8da9"} Nov 25 15:28:01 crc kubenswrapper[4965]: I1125 15:28:01.582834 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerStarted","Data":"7111a949673bf767d4f0f661dfb5fbef7e2c5b1fc9150f59912cac2ba214d5d8"} Nov 25 15:28:02 crc kubenswrapper[4965]: I1125 15:28:02.605451 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerStarted","Data":"8c6fe232014adf1790399b00a33b1bf24892e501421452877056eec4ca4ab486"} Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.619107 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerStarted","Data":"11c5603346cd5c4b36bfd7b45b80dd8ee933ba338fbdb91a8fa206c5b5f13cb0"} Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.619809 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-central-agent" containerID="cri-o://68a4b5db79328b3786fad0ce6fe37af683b136ff4ba745625ce5fe0cf95698d5" gracePeriod=30 Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.620142 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.620294 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="proxy-httpd" containerID="cri-o://11c5603346cd5c4b36bfd7b45b80dd8ee933ba338fbdb91a8fa206c5b5f13cb0" gracePeriod=30 Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.620421 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-notification-agent" containerID="cri-o://7111a949673bf767d4f0f661dfb5fbef7e2c5b1fc9150f59912cac2ba214d5d8" gracePeriod=30 Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.620450 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="sg-core" containerID="cri-o://8c6fe232014adf1790399b00a33b1bf24892e501421452877056eec4ca4ab486" gracePeriod=30 Nov 25 15:28:03 crc kubenswrapper[4965]: I1125 15:28:03.646440 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.68459827 podStartE2EDuration="6.646417083s" podCreationTimestamp="2025-11-25 15:27:57 +0000 UTC" firstStartedPulling="2025-11-25 15:27:58.717792573 +0000 UTC m=+1423.685386319" lastFinishedPulling="2025-11-25 15:28:02.679611386 +0000 UTC m=+1427.647205132" observedRunningTime="2025-11-25 15:28:03.638658882 +0000 UTC m=+1428.606252628" watchObservedRunningTime="2025-11-25 15:28:03.646417083 +0000 UTC m=+1428.614010839" Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.608393 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.664036 4965 generic.go:334] "Generic (PLEG): container finished" 
podID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerID="11c5603346cd5c4b36bfd7b45b80dd8ee933ba338fbdb91a8fa206c5b5f13cb0" exitCode=0 Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.664072 4965 generic.go:334] "Generic (PLEG): container finished" podID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerID="8c6fe232014adf1790399b00a33b1bf24892e501421452877056eec4ca4ab486" exitCode=2 Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.664082 4965 generic.go:334] "Generic (PLEG): container finished" podID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerID="7111a949673bf767d4f0f661dfb5fbef7e2c5b1fc9150f59912cac2ba214d5d8" exitCode=0 Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.664105 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerDied","Data":"11c5603346cd5c4b36bfd7b45b80dd8ee933ba338fbdb91a8fa206c5b5f13cb0"} Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.664142 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerDied","Data":"8c6fe232014adf1790399b00a33b1bf24892e501421452877056eec4ca4ab486"} Nov 25 15:28:04 crc kubenswrapper[4965]: I1125 15:28:04.664154 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerDied","Data":"7111a949673bf767d4f0f661dfb5fbef7e2c5b1fc9150f59912cac2ba214d5d8"} Nov 25 15:28:10 crc kubenswrapper[4965]: I1125 15:28:10.714176 4965 generic.go:334] "Generic (PLEG): container finished" podID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerID="68a4b5db79328b3786fad0ce6fe37af683b136ff4ba745625ce5fe0cf95698d5" exitCode=0 Nov 25 15:28:10 crc kubenswrapper[4965]: I1125 15:28:10.714284 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerDied","Data":"68a4b5db79328b3786fad0ce6fe37af683b136ff4ba745625ce5fe0cf95698d5"} Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.056092 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236075 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-config-data\") pod \"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236154 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-combined-ca-bundle\") pod \"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236206 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-scripts\") pod \"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236284 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-log-httpd\") pod \"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236314 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-run-httpd\") pod 
\"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236414 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvrs\" (UniqueName: \"kubernetes.io/projected/6af98be8-c639-4fe0-9efe-1caf076e412d-kube-api-access-fdvrs\") pod \"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.236486 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-sg-core-conf-yaml\") pod \"6af98be8-c639-4fe0-9efe-1caf076e412d\" (UID: \"6af98be8-c639-4fe0-9efe-1caf076e412d\") " Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.237244 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.237749 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.241561 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-scripts" (OuterVolumeSpecName: "scripts") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.242941 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af98be8-c639-4fe0-9efe-1caf076e412d-kube-api-access-fdvrs" (OuterVolumeSpecName: "kube-api-access-fdvrs") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). InnerVolumeSpecName "kube-api-access-fdvrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.269918 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.329521 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.338097 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvrs\" (UniqueName: \"kubernetes.io/projected/6af98be8-c639-4fe0-9efe-1caf076e412d-kube-api-access-fdvrs\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.338246 4965 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.338335 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.338401 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.338524 4965 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.338578 4965 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af98be8-c639-4fe0-9efe-1caf076e412d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.350837 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-config-data" (OuterVolumeSpecName: "config-data") pod "6af98be8-c639-4fe0-9efe-1caf076e412d" (UID: "6af98be8-c639-4fe0-9efe-1caf076e412d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.440331 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af98be8-c639-4fe0-9efe-1caf076e412d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.739993 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" event={"ID":"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069","Type":"ContainerStarted","Data":"334477545a2b33183c8778624e141b60ac0c37b3664304d1268e5cef3619157a"} Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.743021 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6af98be8-c639-4fe0-9efe-1caf076e412d","Type":"ContainerDied","Data":"968bd9e0aba8d89f1532635e1e52f06eb72e09a2c8a6c16829bb8783f68a6a8b"} Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.743451 4965 scope.go:117] "RemoveContainer" containerID="11c5603346cd5c4b36bfd7b45b80dd8ee933ba338fbdb91a8fa206c5b5f13cb0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.743595 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.783721 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" podStartSLOduration=2.346341947 podStartE2EDuration="13.783700374s" podCreationTimestamp="2025-11-25 15:28:00 +0000 UTC" firstStartedPulling="2025-11-25 15:28:01.3611044 +0000 UTC m=+1426.328698136" lastFinishedPulling="2025-11-25 15:28:12.798462817 +0000 UTC m=+1437.766056563" observedRunningTime="2025-11-25 15:28:13.772083578 +0000 UTC m=+1438.739677324" watchObservedRunningTime="2025-11-25 15:28:13.783700374 +0000 UTC m=+1438.751294120" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.804142 4965 scope.go:117] "RemoveContainer" containerID="8c6fe232014adf1790399b00a33b1bf24892e501421452877056eec4ca4ab486" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.808022 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.816168 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.837400 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:13 crc kubenswrapper[4965]: E1125 15:28:13.837821 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="proxy-httpd" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.837838 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="proxy-httpd" Nov 25 15:28:13 crc kubenswrapper[4965]: E1125 15:28:13.837861 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="sg-core" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.837898 4965 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="sg-core" Nov 25 15:28:13 crc kubenswrapper[4965]: E1125 15:28:13.837916 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-central-agent" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.837924 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-central-agent" Nov 25 15:28:13 crc kubenswrapper[4965]: E1125 15:28:13.837945 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-notification-agent" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.837952 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-notification-agent" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.838248 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-central-agent" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.838269 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="proxy-httpd" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.838284 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="sg-core" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.838306 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" containerName="ceilometer-notification-agent" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.840223 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.842305 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.842600 4965 scope.go:117] "RemoveContainer" containerID="7111a949673bf767d4f0f661dfb5fbef7e2c5b1fc9150f59912cac2ba214d5d8" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.842987 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.849676 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.877542 4965 scope.go:117] "RemoveContainer" containerID="68a4b5db79328b3786fad0ce6fe37af683b136ff4ba745625ce5fe0cf95698d5" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.947934 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.948038 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-config-data\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.948101 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-run-httpd\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " 
pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.948169 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.948219 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-scripts\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.948254 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-log-httpd\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:13 crc kubenswrapper[4965]: I1125 15:28:13.948279 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqqrp\" (UniqueName: \"kubernetes.io/projected/667d5c57-3e9d-420f-920d-19b56f49f631-kube-api-access-bqqrp\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050196 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050243 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-run-httpd\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050278 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-scripts\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050305 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-log-httpd\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050323 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqqrp\" (UniqueName: \"kubernetes.io/projected/667d5c57-3e9d-420f-920d-19b56f49f631-kube-api-access-bqqrp\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050381 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.050432 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-config-data\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " 
pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.051456 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-log-httpd\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.051778 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-run-httpd\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.054763 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-scripts\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.056174 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.056715 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-config-data\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.057482 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.071802 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqqrp\" (UniqueName: \"kubernetes.io/projected/667d5c57-3e9d-420f-920d-19b56f49f631-kube-api-access-bqqrp\") pod \"ceilometer-0\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.162872 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.684928 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.754663 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerStarted","Data":"33f9b33001aced0315728ac1dca3dba9f3e58ccbbd7cf2dee9465e9f6e3ddd59"} Nov 25 15:28:14 crc kubenswrapper[4965]: I1125 15:28:14.797781 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af98be8-c639-4fe0-9efe-1caf076e412d" path="/var/lib/kubelet/pods/6af98be8-c639-4fe0-9efe-1caf076e412d/volumes" Nov 25 15:28:15 crc kubenswrapper[4965]: I1125 15:28:15.766131 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerStarted","Data":"fb13664b038f9d5e6bf785b1b1729ee2e91f6b088a42b98153da4673d7d125ab"} Nov 25 15:28:16 crc kubenswrapper[4965]: I1125 15:28:16.813220 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerStarted","Data":"ba033632c728dd0252e1ca493623b671aea851d776b80a83b075e5ba59084f63"} Nov 25 15:28:16 crc kubenswrapper[4965]: I1125 15:28:16.814604 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerStarted","Data":"96c585ba1935b0fcd6b1fe67a9ad6e56445af68049485c248a8939bd0ee9a2d2"} Nov 25 15:28:18 crc kubenswrapper[4965]: I1125 15:28:18.835903 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerStarted","Data":"4e46236be23334f1e7c97853d011d71ae3d5c1a51029c2dee0f0a155af913150"} Nov 25 15:28:18 crc kubenswrapper[4965]: I1125 15:28:18.836551 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 15:28:18 crc kubenswrapper[4965]: I1125 15:28:18.897504 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6885188319999997 podStartE2EDuration="5.897485287s" podCreationTimestamp="2025-11-25 15:28:13 +0000 UTC" firstStartedPulling="2025-11-25 15:28:14.694024364 +0000 UTC m=+1439.661618110" lastFinishedPulling="2025-11-25 15:28:17.902990829 +0000 UTC m=+1442.870584565" observedRunningTime="2025-11-25 15:28:18.863876354 +0000 UTC m=+1443.831470110" watchObservedRunningTime="2025-11-25 15:28:18.897485287 +0000 UTC m=+1443.865079033" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.061858 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlwlh"] Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.065291 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.075253 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlwlh"] Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.235909 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-catalog-content\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.235960 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zpzx\" (UniqueName: \"kubernetes.io/projected/d35a1980-655b-4a74-a033-0d77c5f6f4a1-kube-api-access-4zpzx\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.236378 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-utilities\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.337844 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-catalog-content\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.337950 4965 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4zpzx\" (UniqueName: \"kubernetes.io/projected/d35a1980-655b-4a74-a033-0d77c5f6f4a1-kube-api-access-4zpzx\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.338060 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-utilities\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.338582 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-catalog-content\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.338590 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-utilities\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.362770 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zpzx\" (UniqueName: \"kubernetes.io/projected/d35a1980-655b-4a74-a033-0d77c5f6f4a1-kube-api-access-4zpzx\") pod \"redhat-operators-wlwlh\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.397256 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:28:31 crc kubenswrapper[4965]: I1125 15:28:31.994155 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlwlh"] Nov 25 15:28:32 crc kubenswrapper[4965]: W1125 15:28:32.014135 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd35a1980_655b_4a74_a033_0d77c5f6f4a1.slice/crio-1ab1361031bc886109e6fc8ed89c1d956a15b252980347854d3d8cf63ba25916 WatchSource:0}: Error finding container 1ab1361031bc886109e6fc8ed89c1d956a15b252980347854d3d8cf63ba25916: Status 404 returned error can't find the container with id 1ab1361031bc886109e6fc8ed89c1d956a15b252980347854d3d8cf63ba25916 Nov 25 15:28:32 crc kubenswrapper[4965]: I1125 15:28:32.964902 4965 generic.go:334] "Generic (PLEG): container finished" podID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerID="7c81b4925b12b054a176e904f6fc38d5cc55515357cf1eb3bf0656c597ee652f" exitCode=0 Nov 25 15:28:32 crc kubenswrapper[4965]: I1125 15:28:32.965061 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerDied","Data":"7c81b4925b12b054a176e904f6fc38d5cc55515357cf1eb3bf0656c597ee652f"} Nov 25 15:28:32 crc kubenswrapper[4965]: I1125 15:28:32.965221 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerStarted","Data":"1ab1361031bc886109e6fc8ed89c1d956a15b252980347854d3d8cf63ba25916"} Nov 25 15:28:36 crc kubenswrapper[4965]: I1125 15:28:36.021335 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" 
event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerStarted","Data":"624ab4f03f6482c3d9acf9b9f418a5c0ce370eb7b70c8cb982b1fb6dbe2f41cd"} Nov 25 15:28:44 crc kubenswrapper[4965]: I1125 15:28:44.512873 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 15:28:48 crc kubenswrapper[4965]: I1125 15:28:48.707742 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:28:48 crc kubenswrapper[4965]: I1125 15:28:48.708404 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8bccea83-ab65-40e5-943f-f35e98b7618c" containerName="kube-state-metrics" containerID="cri-o://ae3899ee6e0566d23cc54c8f00e8b7e62adc98332699fc01301c8265345c1008" gracePeriod=30 Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.138391 4965 generic.go:334] "Generic (PLEG): container finished" podID="8bccea83-ab65-40e5-943f-f35e98b7618c" containerID="ae3899ee6e0566d23cc54c8f00e8b7e62adc98332699fc01301c8265345c1008" exitCode=2 Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.138524 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8bccea83-ab65-40e5-943f-f35e98b7618c","Type":"ContainerDied","Data":"ae3899ee6e0566d23cc54c8f00e8b7e62adc98332699fc01301c8265345c1008"} Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.141723 4965 generic.go:334] "Generic (PLEG): container finished" podID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerID="624ab4f03f6482c3d9acf9b9f418a5c0ce370eb7b70c8cb982b1fb6dbe2f41cd" exitCode=0 Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.141759 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerDied","Data":"624ab4f03f6482c3d9acf9b9f418a5c0ce370eb7b70c8cb982b1fb6dbe2f41cd"} Nov 25 15:28:49 crc 
kubenswrapper[4965]: I1125 15:28:49.745497 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.745986 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="sg-core" containerID="cri-o://ba033632c728dd0252e1ca493623b671aea851d776b80a83b075e5ba59084f63" gracePeriod=30 Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.746154 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-notification-agent" containerID="cri-o://96c585ba1935b0fcd6b1fe67a9ad6e56445af68049485c248a8939bd0ee9a2d2" gracePeriod=30 Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.746261 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="proxy-httpd" containerID="cri-o://4e46236be23334f1e7c97853d011d71ae3d5c1a51029c2dee0f0a155af913150" gracePeriod=30 Nov 25 15:28:49 crc kubenswrapper[4965]: I1125 15:28:49.745800 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-central-agent" containerID="cri-o://fb13664b038f9d5e6bf785b1b1729ee2e91f6b088a42b98153da4673d7d125ab" gracePeriod=30 Nov 25 15:28:51 crc kubenswrapper[4965]: I1125 15:28:51.229109 4965 generic.go:334] "Generic (PLEG): container finished" podID="667d5c57-3e9d-420f-920d-19b56f49f631" containerID="ba033632c728dd0252e1ca493623b671aea851d776b80a83b075e5ba59084f63" exitCode=2 Nov 25 15:28:51 crc kubenswrapper[4965]: I1125 15:28:51.229470 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerDied","Data":"ba033632c728dd0252e1ca493623b671aea851d776b80a83b075e5ba59084f63"} Nov 25 15:28:51 crc kubenswrapper[4965]: I1125 15:28:51.968797 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.105059 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n8qq\" (UniqueName: \"kubernetes.io/projected/8bccea83-ab65-40e5-943f-f35e98b7618c-kube-api-access-4n8qq\") pod \"8bccea83-ab65-40e5-943f-f35e98b7618c\" (UID: \"8bccea83-ab65-40e5-943f-f35e98b7618c\") " Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.114979 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bccea83-ab65-40e5-943f-f35e98b7618c-kube-api-access-4n8qq" (OuterVolumeSpecName: "kube-api-access-4n8qq") pod "8bccea83-ab65-40e5-943f-f35e98b7618c" (UID: "8bccea83-ab65-40e5-943f-f35e98b7618c"). InnerVolumeSpecName "kube-api-access-4n8qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.207704 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n8qq\" (UniqueName: \"kubernetes.io/projected/8bccea83-ab65-40e5-943f-f35e98b7618c-kube-api-access-4n8qq\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.241903 4965 generic.go:334] "Generic (PLEG): container finished" podID="667d5c57-3e9d-420f-920d-19b56f49f631" containerID="4e46236be23334f1e7c97853d011d71ae3d5c1a51029c2dee0f0a155af913150" exitCode=0 Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.241937 4965 generic.go:334] "Generic (PLEG): container finished" podID="667d5c57-3e9d-420f-920d-19b56f49f631" containerID="fb13664b038f9d5e6bf785b1b1729ee2e91f6b088a42b98153da4673d7d125ab" exitCode=0 Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.241993 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerDied","Data":"4e46236be23334f1e7c97853d011d71ae3d5c1a51029c2dee0f0a155af913150"} Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.242038 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerDied","Data":"fb13664b038f9d5e6bf785b1b1729ee2e91f6b088a42b98153da4673d7d125ab"} Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.244558 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8bccea83-ab65-40e5-943f-f35e98b7618c","Type":"ContainerDied","Data":"6a0fbb40a11945386d67f20755348f768744d58ddd7c480a24aa566fa312bc22"} Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.244597 4965 scope.go:117] "RemoveContainer" containerID="ae3899ee6e0566d23cc54c8f00e8b7e62adc98332699fc01301c8265345c1008" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.244602 4965 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.300416 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.311322 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.318707 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:28:52 crc kubenswrapper[4965]: E1125 15:28:52.319233 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bccea83-ab65-40e5-943f-f35e98b7618c" containerName="kube-state-metrics" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.319251 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bccea83-ab65-40e5-943f-f35e98b7618c" containerName="kube-state-metrics" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.319543 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bccea83-ab65-40e5-943f-f35e98b7618c" containerName="kube-state-metrics" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.320376 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.326859 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.354369 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.354499 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.410346 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.410407 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.410427 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.410504 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4m42\" (UniqueName: 
\"kubernetes.io/projected/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-api-access-j4m42\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: E1125 15:28:52.436805 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bccea83_ab65_40e5_943f_f35e98b7618c.slice\": RecentStats: unable to find data in memory cache]" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.512465 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.512529 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.512550 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.512606 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4m42\" (UniqueName: \"kubernetes.io/projected/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-api-access-j4m42\") pod 
\"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.516469 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.524616 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.524908 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.536118 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4m42\" (UniqueName: \"kubernetes.io/projected/54bc32ba-400a-4cc6-a7a0-c03eb66edd9d-kube-api-access-j4m42\") pod \"kube-state-metrics-0\" (UID: \"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d\") " pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.670324 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 15:28:52 crc kubenswrapper[4965]: I1125 15:28:52.781300 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bccea83-ab65-40e5-943f-f35e98b7618c" path="/var/lib/kubelet/pods/8bccea83-ab65-40e5-943f-f35e98b7618c/volumes" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.177122 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.253855 4965 generic.go:334] "Generic (PLEG): container finished" podID="667d5c57-3e9d-420f-920d-19b56f49f631" containerID="96c585ba1935b0fcd6b1fe67a9ad6e56445af68049485c248a8939bd0ee9a2d2" exitCode=0 Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.253909 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerDied","Data":"96c585ba1935b0fcd6b1fe67a9ad6e56445af68049485c248a8939bd0ee9a2d2"} Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.255417 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d","Type":"ContainerStarted","Data":"03f1655e0eefa8ad078445e223dc055758f9f02511e3469a198e94618f2b701d"} Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.728399 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.840400 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqqrp\" (UniqueName: \"kubernetes.io/projected/667d5c57-3e9d-420f-920d-19b56f49f631-kube-api-access-bqqrp\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.840752 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-run-httpd\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.840893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-log-httpd\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.840943 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-sg-core-conf-yaml\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.840984 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-combined-ca-bundle\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.840999 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.841025 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-scripts\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.841079 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-config-data\") pod \"667d5c57-3e9d-420f-920d-19b56f49f631\" (UID: \"667d5c57-3e9d-420f-920d-19b56f49f631\") " Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.841436 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.841892 4965 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.841907 4965 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/667d5c57-3e9d-420f-920d-19b56f49f631-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.846050 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667d5c57-3e9d-420f-920d-19b56f49f631-kube-api-access-bqqrp" (OuterVolumeSpecName: "kube-api-access-bqqrp") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "kube-api-access-bqqrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.846497 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-scripts" (OuterVolumeSpecName: "scripts") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.870520 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.911484 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.943192 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.943247 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqqrp\" (UniqueName: \"kubernetes.io/projected/667d5c57-3e9d-420f-920d-19b56f49f631-kube-api-access-bqqrp\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.943262 4965 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.943273 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:53 crc kubenswrapper[4965]: I1125 15:28:53.961178 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-config-data" (OuterVolumeSpecName: "config-data") pod "667d5c57-3e9d-420f-920d-19b56f49f631" (UID: "667d5c57-3e9d-420f-920d-19b56f49f631"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.044360 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667d5c57-3e9d-420f-920d-19b56f49f631-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.267676 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.267710 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"667d5c57-3e9d-420f-920d-19b56f49f631","Type":"ContainerDied","Data":"33f9b33001aced0315728ac1dca3dba9f3e58ccbbd7cf2dee9465e9f6e3ddd59"} Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.267778 4965 scope.go:117] "RemoveContainer" containerID="4e46236be23334f1e7c97853d011d71ae3d5c1a51029c2dee0f0a155af913150" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.270797 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerStarted","Data":"b90142a02e2520046ff474ba3f5e2949b5bdc01e000f9add863afbd6d147ed0e"} Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.304100 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlwlh" podStartSLOduration=2.998455692 podStartE2EDuration="23.30407916s" podCreationTimestamp="2025-11-25 15:28:31 +0000 UTC" firstStartedPulling="2025-11-25 15:28:32.967129822 +0000 UTC m=+1457.934723558" lastFinishedPulling="2025-11-25 15:28:53.27275328 +0000 UTC m=+1478.240347026" observedRunningTime="2025-11-25 15:28:54.296121714 +0000 UTC m=+1479.263715470" watchObservedRunningTime="2025-11-25 15:28:54.30407916 +0000 UTC m=+1479.271672906" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.354436 4965 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.365437 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.371241 4965 scope.go:117] "RemoveContainer" containerID="ba033632c728dd0252e1ca493623b671aea851d776b80a83b075e5ba59084f63" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.375653 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:54 crc kubenswrapper[4965]: E1125 15:28:54.376019 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-central-agent" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376032 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-central-agent" Nov 25 15:28:54 crc kubenswrapper[4965]: E1125 15:28:54.376038 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="proxy-httpd" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376045 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="proxy-httpd" Nov 25 15:28:54 crc kubenswrapper[4965]: E1125 15:28:54.376062 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="sg-core" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376068 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="sg-core" Nov 25 15:28:54 crc kubenswrapper[4965]: E1125 15:28:54.376079 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-notification-agent" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 
15:28:54.376086 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-notification-agent" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376286 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="proxy-httpd" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376316 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-notification-agent" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376327 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="sg-core" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.376342 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" containerName="ceilometer-central-agent" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.404245 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.404345 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.409458 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.412422 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.412649 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.458818 4965 scope.go:117] "RemoveContainer" containerID="96c585ba1935b0fcd6b1fe67a9ad6e56445af68049485c248a8939bd0ee9a2d2" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.459835 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-config-data\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.459867 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-run-httpd\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.459890 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.459914 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.459957 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-scripts\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.460038 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29c26\" (UniqueName: \"kubernetes.io/projected/79bce5f5-733f-4651-8fa2-6756ff1993ee-kube-api-access-29c26\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.460061 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-log-httpd\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.460091 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.491264 4965 scope.go:117] "RemoveContainer" containerID="fb13664b038f9d5e6bf785b1b1729ee2e91f6b088a42b98153da4673d7d125ab" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562161 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-config-data\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562221 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-run-httpd\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562253 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562291 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562351 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-scripts\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562397 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29c26\" (UniqueName: \"kubernetes.io/projected/79bce5f5-733f-4651-8fa2-6756ff1993ee-kube-api-access-29c26\") pod \"ceilometer-0\" (UID: 
\"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562426 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-log-httpd\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.562462 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.563635 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-log-httpd\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.563894 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-run-httpd\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.567427 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-scripts\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.572142 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.572777 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.575503 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.578155 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-config-data\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.578518 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29c26\" (UniqueName: \"kubernetes.io/projected/79bce5f5-733f-4651-8fa2-6756ff1993ee-kube-api-access-29c26\") pod \"ceilometer-0\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.744073 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:28:54 crc kubenswrapper[4965]: I1125 15:28:54.786453 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667d5c57-3e9d-420f-920d-19b56f49f631" path="/var/lib/kubelet/pods/667d5c57-3e9d-420f-920d-19b56f49f631/volumes" Nov 25 15:28:55 crc kubenswrapper[4965]: I1125 15:28:55.207854 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:28:55 crc kubenswrapper[4965]: W1125 15:28:55.223580 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79bce5f5_733f_4651_8fa2_6756ff1993ee.slice/crio-9372651e62955a133413d6cfaadb5e3fce62b8424b268c70d89f2fada9953efe WatchSource:0}: Error finding container 9372651e62955a133413d6cfaadb5e3fce62b8424b268c70d89f2fada9953efe: Status 404 returned error can't find the container with id 9372651e62955a133413d6cfaadb5e3fce62b8424b268c70d89f2fada9953efe Nov 25 15:28:55 crc kubenswrapper[4965]: I1125 15:28:55.286684 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"54bc32ba-400a-4cc6-a7a0-c03eb66edd9d","Type":"ContainerStarted","Data":"cc68c907eccd48ba081cf0ccf6a707b6d59cf5e9a74de40643147226d455aaca"} Nov 25 15:28:55 crc kubenswrapper[4965]: I1125 15:28:55.286996 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 15:28:55 crc kubenswrapper[4965]: I1125 15:28:55.292185 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerStarted","Data":"9372651e62955a133413d6cfaadb5e3fce62b8424b268c70d89f2fada9953efe"} Nov 25 15:28:55 crc kubenswrapper[4965]: I1125 15:28:55.307184 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.178937068 
podStartE2EDuration="3.307162212s" podCreationTimestamp="2025-11-25 15:28:52 +0000 UTC" firstStartedPulling="2025-11-25 15:28:53.242892279 +0000 UTC m=+1478.210486025" lastFinishedPulling="2025-11-25 15:28:54.371117423 +0000 UTC m=+1479.338711169" observedRunningTime="2025-11-25 15:28:55.303221514 +0000 UTC m=+1480.270815260" watchObservedRunningTime="2025-11-25 15:28:55.307162212 +0000 UTC m=+1480.274755958" Nov 25 15:28:57 crc kubenswrapper[4965]: I1125 15:28:57.310081 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerStarted","Data":"f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118"} Nov 25 15:28:59 crc kubenswrapper[4965]: I1125 15:28:59.325428 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerStarted","Data":"28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93"} Nov 25 15:29:01 crc kubenswrapper[4965]: I1125 15:29:01.345476 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerStarted","Data":"cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49"} Nov 25 15:29:01 crc kubenswrapper[4965]: I1125 15:29:01.397403 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:29:01 crc kubenswrapper[4965]: I1125 15:29:01.398310 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:29:01 crc kubenswrapper[4965]: I1125 15:29:01.445305 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:29:02 crc kubenswrapper[4965]: I1125 15:29:02.441220 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:29:02 crc kubenswrapper[4965]: I1125 15:29:02.530184 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlwlh"] Nov 25 15:29:03 crc kubenswrapper[4965]: I1125 15:29:03.166185 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 15:29:04 crc kubenswrapper[4965]: I1125 15:29:04.374642 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlwlh" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="registry-server" containerID="cri-o://b90142a02e2520046ff474ba3f5e2949b5bdc01e000f9add863afbd6d147ed0e" gracePeriod=2 Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.393814 4965 generic.go:334] "Generic (PLEG): container finished" podID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerID="b90142a02e2520046ff474ba3f5e2949b5bdc01e000f9add863afbd6d147ed0e" exitCode=0 Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.394283 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerDied","Data":"b90142a02e2520046ff474ba3f5e2949b5bdc01e000f9add863afbd6d147ed0e"} Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.804385 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.948410 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-utilities\") pod \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.948494 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-catalog-content\") pod \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.948534 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zpzx\" (UniqueName: \"kubernetes.io/projected/d35a1980-655b-4a74-a033-0d77c5f6f4a1-kube-api-access-4zpzx\") pod \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\" (UID: \"d35a1980-655b-4a74-a033-0d77c5f6f4a1\") " Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.950778 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-utilities" (OuterVolumeSpecName: "utilities") pod "d35a1980-655b-4a74-a033-0d77c5f6f4a1" (UID: "d35a1980-655b-4a74-a033-0d77c5f6f4a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:29:06 crc kubenswrapper[4965]: I1125 15:29:06.954195 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35a1980-655b-4a74-a033-0d77c5f6f4a1-kube-api-access-4zpzx" (OuterVolumeSpecName: "kube-api-access-4zpzx") pod "d35a1980-655b-4a74-a033-0d77c5f6f4a1" (UID: "d35a1980-655b-4a74-a033-0d77c5f6f4a1"). InnerVolumeSpecName "kube-api-access-4zpzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.042002 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d35a1980-655b-4a74-a033-0d77c5f6f4a1" (UID: "d35a1980-655b-4a74-a033-0d77c5f6f4a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.050437 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.050483 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35a1980-655b-4a74-a033-0d77c5f6f4a1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.050497 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zpzx\" (UniqueName: \"kubernetes.io/projected/d35a1980-655b-4a74-a033-0d77c5f6f4a1-kube-api-access-4zpzx\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.404283 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlwlh" event={"ID":"d35a1980-655b-4a74-a033-0d77c5f6f4a1","Type":"ContainerDied","Data":"1ab1361031bc886109e6fc8ed89c1d956a15b252980347854d3d8cf63ba25916"} Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.404339 4965 scope.go:117] "RemoveContainer" containerID="b90142a02e2520046ff474ba3f5e2949b5bdc01e000f9add863afbd6d147ed0e" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.404381 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlwlh" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.408693 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerStarted","Data":"aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d"} Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.409960 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.442313 4965 scope.go:117] "RemoveContainer" containerID="624ab4f03f6482c3d9acf9b9f418a5c0ce370eb7b70c8cb982b1fb6dbe2f41cd" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.443724 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.494117475 podStartE2EDuration="13.443711489s" podCreationTimestamp="2025-11-25 15:28:54 +0000 UTC" firstStartedPulling="2025-11-25 15:28:55.227065954 +0000 UTC m=+1480.194659700" lastFinishedPulling="2025-11-25 15:29:06.176659968 +0000 UTC m=+1491.144253714" observedRunningTime="2025-11-25 15:29:07.439321029 +0000 UTC m=+1492.406914775" watchObservedRunningTime="2025-11-25 15:29:07.443711489 +0000 UTC m=+1492.411305245" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.463112 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlwlh"] Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.465636 4965 scope.go:117] "RemoveContainer" containerID="7c81b4925b12b054a176e904f6fc38d5cc55515357cf1eb3bf0656c597ee652f" Nov 25 15:29:07 crc kubenswrapper[4965]: I1125 15:29:07.481911 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlwlh"] Nov 25 15:29:08 crc kubenswrapper[4965]: I1125 15:29:08.783729 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" path="/var/lib/kubelet/pods/d35a1980-655b-4a74-a033-0d77c5f6f4a1/volumes" Nov 25 15:29:21 crc kubenswrapper[4965]: I1125 15:29:21.589128 4965 generic.go:334] "Generic (PLEG): container finished" podID="c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" containerID="334477545a2b33183c8778624e141b60ac0c37b3664304d1268e5cef3619157a" exitCode=0 Nov 25 15:29:21 crc kubenswrapper[4965]: I1125 15:29:21.589167 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" event={"ID":"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069","Type":"ContainerDied","Data":"334477545a2b33183c8778624e141b60ac0c37b3664304d1268e5cef3619157a"} Nov 25 15:29:22 crc kubenswrapper[4965]: I1125 15:29:22.941041 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.054816 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxkl4\" (UniqueName: \"kubernetes.io/projected/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-kube-api-access-fxkl4\") pod \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.054938 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-scripts\") pod \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.055055 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-combined-ca-bundle\") pod \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " Nov 25 15:29:23 crc kubenswrapper[4965]: 
I1125 15:29:23.055100 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-config-data\") pod \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\" (UID: \"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069\") " Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.060428 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-scripts" (OuterVolumeSpecName: "scripts") pod "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" (UID: "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.065306 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-kube-api-access-fxkl4" (OuterVolumeSpecName: "kube-api-access-fxkl4") pod "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" (UID: "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069"). InnerVolumeSpecName "kube-api-access-fxkl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.085567 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" (UID: "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.101651 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-config-data" (OuterVolumeSpecName: "config-data") pod "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" (UID: "c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.157505 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.157539 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.157555 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.157568 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxkl4\" (UniqueName: \"kubernetes.io/projected/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069-kube-api-access-fxkl4\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.260801 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.260867 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.607630 4965 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" event={"ID":"c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069","Type":"ContainerDied","Data":"a22f5a9dccd39d66dc874c5cf4ecb7576190f4aa5b5637d20a7ee9929ebb8da9"} Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.607675 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22f5a9dccd39d66dc874c5cf4ecb7576190f4aa5b5637d20a7ee9929ebb8da9" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.607728 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hkzn5" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.717350 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 15:29:23 crc kubenswrapper[4965]: E1125 15:29:23.717818 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" containerName="nova-cell0-conductor-db-sync" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.717840 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" containerName="nova-cell0-conductor-db-sync" Nov 25 15:29:23 crc kubenswrapper[4965]: E1125 15:29:23.717856 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="extract-utilities" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.717862 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="extract-utilities" Nov 25 15:29:23 crc kubenswrapper[4965]: E1125 15:29:23.717872 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="extract-content" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.717879 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" 
containerName="extract-content" Nov 25 15:29:23 crc kubenswrapper[4965]: E1125 15:29:23.717896 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="registry-server" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.717902 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="registry-server" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.718068 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" containerName="nova-cell0-conductor-db-sync" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.718089 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35a1980-655b-4a74-a033-0d77c5f6f4a1" containerName="registry-server" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.718651 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.722960 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.723869 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vz49h" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.741031 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.870669 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.870836 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.870864 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wl5k\" (UniqueName: \"kubernetes.io/projected/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-kube-api-access-4wl5k\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.972725 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.972794 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wl5k\" (UniqueName: \"kubernetes.io/projected/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-kube-api-access-4wl5k\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.972900 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.981433 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.989168 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:23 crc kubenswrapper[4965]: I1125 15:29:23.989880 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wl5k\" (UniqueName: \"kubernetes.io/projected/bcd74f74-0867-4dd1-a28e-18cd43f2b3f6-kube-api-access-4wl5k\") pod \"nova-cell0-conductor-0\" (UID: \"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6\") " pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:24 crc kubenswrapper[4965]: I1125 15:29:24.042199 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:24 crc kubenswrapper[4965]: I1125 15:29:24.493869 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 15:29:24 crc kubenswrapper[4965]: I1125 15:29:24.616500 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6","Type":"ContainerStarted","Data":"2fd87dadabc89333ab903a8a03fa7c000896a378abd4ad2e9e5b40aaddcbd52f"} Nov 25 15:29:24 crc kubenswrapper[4965]: I1125 15:29:24.758398 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 15:29:25 crc kubenswrapper[4965]: I1125 15:29:25.628892 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcd74f74-0867-4dd1-a28e-18cd43f2b3f6","Type":"ContainerStarted","Data":"d55f6169471f46a595cb26dd1ad8b6b4bb38a9249bd75d59b1bffb46468043e9"} Nov 25 15:29:25 crc kubenswrapper[4965]: I1125 15:29:25.628984 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:25 crc kubenswrapper[4965]: I1125 15:29:25.646298 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.646280528 podStartE2EDuration="2.646280528s" podCreationTimestamp="2025-11-25 15:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:25.641304972 +0000 UTC m=+1510.608898738" watchObservedRunningTime="2025-11-25 15:29:25.646280528 +0000 UTC m=+1510.613874274" Nov 25 15:29:29 crc kubenswrapper[4965]: I1125 15:29:29.419886 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 25 15:29:29 crc kubenswrapper[4965]: I1125 15:29:29.998979 4965 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qmnc9"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.000344 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.021189 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.021573 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.027386 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qmnc9"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.079560 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-scripts\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.079869 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.079892 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-config-data\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" 
Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.079919 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsfwv\" (UniqueName: \"kubernetes.io/projected/697f5736-bc04-45e5-a874-982e8cd7d8e3-kube-api-access-jsfwv\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.163211 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.164678 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.169042 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.180907 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-scripts\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.181117 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.181209 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-config-data\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: 
\"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.181286 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsfwv\" (UniqueName: \"kubernetes.io/projected/697f5736-bc04-45e5-a874-982e8cd7d8e3-kube-api-access-jsfwv\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.187419 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-scripts\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.202568 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.209850 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-config-data\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.230111 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.274285 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsfwv\" (UniqueName: 
\"kubernetes.io/projected/697f5736-bc04-45e5-a874-982e8cd7d8e3-kube-api-access-jsfwv\") pod \"nova-cell0-cell-mapping-qmnc9\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.284882 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.285951 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69khl\" (UniqueName: \"kubernetes.io/projected/29ff49c0-272e-40b2-a869-f42e82a37bd2-kube-api-access-69khl\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.286059 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.286087 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-config-data\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.298346 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.298460 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.301253 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.334929 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.392732 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.392783 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69khl\" (UniqueName: \"kubernetes.io/projected/29ff49c0-272e-40b2-a869-f42e82a37bd2-kube-api-access-69khl\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.392823 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.392860 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhv6r\" (UniqueName: \"kubernetes.io/projected/a3094974-d007-468c-8c97-455d28c6f1ff-kube-api-access-nhv6r\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.392914 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.392941 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-config-data\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.403455 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.438622 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-config-data\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.440307 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.448543 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69khl\" (UniqueName: \"kubernetes.io/projected/29ff49c0-272e-40b2-a869-f42e82a37bd2-kube-api-access-69khl\") pod \"nova-scheduler-0\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " pod="openstack/nova-scheduler-0" Nov 
25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.470611 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.477616 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.483535 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.494931 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.496325 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.496498 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhv6r\" (UniqueName: \"kubernetes.io/projected/a3094974-d007-468c-8c97-455d28c6f1ff-kube-api-access-nhv6r\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.495567 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.498472 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.499176 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.500811 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.511592 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.537685 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.550865 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhv6r\" (UniqueName: \"kubernetes.io/projected/a3094974-d007-468c-8c97-455d28c6f1ff-kube-api-access-nhv6r\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.568001 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.598450 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blfcc\" (UniqueName: \"kubernetes.io/projected/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-kube-api-access-blfcc\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " 
pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.598673 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13149900-9b99-4922-9e7f-bf309114a01a-logs\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.598756 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-config-data\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.598880 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.598985 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-config-data\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.599087 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.599179 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-logs\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.599275 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85tvt\" (UniqueName: \"kubernetes.io/projected/13149900-9b99-4922-9e7f-bf309114a01a-kube-api-access-85tvt\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.610326 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-s6z25"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.612140 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.643240 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-s6z25"] Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.653835 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.700445 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85tvt\" (UniqueName: \"kubernetes.io/projected/13149900-9b99-4922-9e7f-bf309114a01a-kube-api-access-85tvt\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.700701 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-dns-svc\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.700783 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blfcc\" (UniqueName: \"kubernetes.io/projected/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-kube-api-access-blfcc\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.700854 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13149900-9b99-4922-9e7f-bf309114a01a-logs\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.700939 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-config-data\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701024 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qtj\" (UniqueName: \"kubernetes.io/projected/db43ec48-b8c9-4a63-960d-aaf762b4e184-kube-api-access-s9qtj\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701157 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701240 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-config\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701387 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-config-data\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701493 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701585 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701661 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-logs\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.701739 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.702708 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13149900-9b99-4922-9e7f-bf309114a01a-logs\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.706941 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-logs\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.708479 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc 
kubenswrapper[4965]: I1125 15:29:30.709094 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-config-data\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.711897 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.713181 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-config-data\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.747890 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85tvt\" (UniqueName: \"kubernetes.io/projected/13149900-9b99-4922-9e7f-bf309114a01a-kube-api-access-85tvt\") pod \"nova-metadata-0\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.758502 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blfcc\" (UniqueName: \"kubernetes.io/projected/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-kube-api-access-blfcc\") pod \"nova-api-0\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.803598 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-dns-svc\") pod 
\"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.803953 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qtj\" (UniqueName: \"kubernetes.io/projected/db43ec48-b8c9-4a63-960d-aaf762b4e184-kube-api-access-s9qtj\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.804159 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-config\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.804317 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.804407 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.805417 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: 
\"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.807378 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.807904 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-dns-svc\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.813706 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-config\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.819296 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.843414 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qtj\" (UniqueName: \"kubernetes.io/projected/db43ec48-b8c9-4a63-960d-aaf762b4e184-kube-api-access-s9qtj\") pod \"dnsmasq-dns-566b5b7845-s6z25\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.899816 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:29:30 crc kubenswrapper[4965]: I1125 15:29:30.974397 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.386617 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qmnc9"] Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.506021 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.743172 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ff49c0-272e-40b2-a869-f42e82a37bd2","Type":"ContainerStarted","Data":"7fea9478d758c0386384bd097a515b401c6a07e2f21d10b9c57b18213656f4d2"} Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.746476 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qmnc9" event={"ID":"697f5736-bc04-45e5-a874-982e8cd7d8e3","Type":"ContainerStarted","Data":"2894207bdfbd35cb7fa3487f03d0449441017e89bd619b80a313ff29517e41ae"} Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.796551 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.836259 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.973046 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 15:29:31 crc kubenswrapper[4965]: I1125 15:29:31.993590 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-s6z25"] Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.173515 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r7rws"] Nov 25 15:29:32 crc kubenswrapper[4965]: 
I1125 15:29:32.174940 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.176907 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.177339 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.190629 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r7rws"] Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.244858 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77d8r\" (UniqueName: \"kubernetes.io/projected/814b2e75-fe75-4874-b493-f739902f9202-kube-api-access-77d8r\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.244933 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-config-data\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.244955 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-scripts\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.245034 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.346323 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.346645 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77d8r\" (UniqueName: \"kubernetes.io/projected/814b2e75-fe75-4874-b493-f739902f9202-kube-api-access-77d8r\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.346702 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-config-data\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.346725 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-scripts\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: 
I1125 15:29:32.352344 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.353338 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-scripts\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.354476 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-config-data\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.368606 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77d8r\" (UniqueName: \"kubernetes.io/projected/814b2e75-fe75-4874-b493-f739902f9202-kube-api-access-77d8r\") pod \"nova-cell1-conductor-db-sync-r7rws\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.565613 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.759769 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f","Type":"ContainerStarted","Data":"988580a5cde730bb491b90e70be37950b0f4fddcb6fa6787e652f8ae8e010d0f"} Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.762663 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3094974-d007-468c-8c97-455d28c6f1ff","Type":"ContainerStarted","Data":"46dc8a213f8c43af39f90cc38c625a7102e97387605b733483511485403751be"} Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.765378 4965 generic.go:334] "Generic (PLEG): container finished" podID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerID="957d5304ffccad3a9264df3e5663d6c098d57ecddb1ad72013ce2fa91b99aac5" exitCode=0 Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.765452 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" event={"ID":"db43ec48-b8c9-4a63-960d-aaf762b4e184","Type":"ContainerDied","Data":"957d5304ffccad3a9264df3e5663d6c098d57ecddb1ad72013ce2fa91b99aac5"} Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.765477 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" event={"ID":"db43ec48-b8c9-4a63-960d-aaf762b4e184","Type":"ContainerStarted","Data":"11dee7a3392af7881f9f6ccc686fb7c547484d4274a2c00005bc241e0d87afba"} Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.768325 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13149900-9b99-4922-9e7f-bf309114a01a","Type":"ContainerStarted","Data":"4d5022442b911e320fd03b8fe5464533eb4c63b064aa657d9c69f70c2b9275b9"} Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.806372 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-qmnc9" event={"ID":"697f5736-bc04-45e5-a874-982e8cd7d8e3","Type":"ContainerStarted","Data":"22c5bab323a242afa2cf3d32c11e12c884aa608430fc5ea7d63e62abf8bcbdfb"} Nov 25 15:29:32 crc kubenswrapper[4965]: I1125 15:29:32.846118 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qmnc9" podStartSLOduration=3.846016453 podStartE2EDuration="3.846016453s" podCreationTimestamp="2025-11-25 15:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:32.843244207 +0000 UTC m=+1517.810837963" watchObservedRunningTime="2025-11-25 15:29:32.846016453 +0000 UTC m=+1517.813610199" Nov 25 15:29:33 crc kubenswrapper[4965]: I1125 15:29:33.263645 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r7rws"] Nov 25 15:29:33 crc kubenswrapper[4965]: I1125 15:29:33.833752 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r7rws" event={"ID":"814b2e75-fe75-4874-b493-f739902f9202","Type":"ContainerStarted","Data":"ed481eb3ce30f2137a54ae751deaa04eb094f580528e150af5ce7c6c578f0cb4"} Nov 25 15:29:33 crc kubenswrapper[4965]: I1125 15:29:33.840171 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" event={"ID":"db43ec48-b8c9-4a63-960d-aaf762b4e184","Type":"ContainerStarted","Data":"25bd374fe6559789b413b1c6550728ef13fad5dfda716db96d0dd7739fff1a8d"} Nov 25 15:29:33 crc kubenswrapper[4965]: I1125 15:29:33.841386 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:33 crc kubenswrapper[4965]: I1125 15:29:33.870156 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" podStartSLOduration=3.870135886 podStartE2EDuration="3.870135886s" 
podCreationTimestamp="2025-11-25 15:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:33.862481367 +0000 UTC m=+1518.830075113" watchObservedRunningTime="2025-11-25 15:29:33.870135886 +0000 UTC m=+1518.837729632" Nov 25 15:29:34 crc kubenswrapper[4965]: I1125 15:29:34.620706 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:34 crc kubenswrapper[4965]: I1125 15:29:34.627994 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 15:29:34 crc kubenswrapper[4965]: I1125 15:29:34.854031 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r7rws" event={"ID":"814b2e75-fe75-4874-b493-f739902f9202","Type":"ContainerStarted","Data":"e3e6372085103461a2828509236bdbd47361dcb86a73d0c276e3e6e89ef91891"} Nov 25 15:29:34 crc kubenswrapper[4965]: I1125 15:29:34.884499 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-r7rws" podStartSLOduration=2.884479072 podStartE2EDuration="2.884479072s" podCreationTimestamp="2025-11-25 15:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:34.876798483 +0000 UTC m=+1519.844392229" watchObservedRunningTime="2025-11-25 15:29:34.884479072 +0000 UTC m=+1519.852072818" Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.896017 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ff49c0-272e-40b2-a869-f42e82a37bd2","Type":"ContainerStarted","Data":"013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c"} Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.900353 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"a3094974-d007-468c-8c97-455d28c6f1ff","Type":"ContainerStarted","Data":"5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7"} Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.900477 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a3094974-d007-468c-8c97-455d28c6f1ff" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7" gracePeriod=30 Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.904750 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13149900-9b99-4922-9e7f-bf309114a01a","Type":"ContainerStarted","Data":"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939"} Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.910152 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f","Type":"ContainerStarted","Data":"af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f"} Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.922394 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.207929703 podStartE2EDuration="9.922369832s" podCreationTimestamp="2025-11-25 15:29:30 +0000 UTC" firstStartedPulling="2025-11-25 15:29:31.546997409 +0000 UTC m=+1516.514591155" lastFinishedPulling="2025-11-25 15:29:39.261437538 +0000 UTC m=+1524.229031284" observedRunningTime="2025-11-25 15:29:39.916014578 +0000 UTC m=+1524.883608334" watchObservedRunningTime="2025-11-25 15:29:39.922369832 +0000 UTC m=+1524.889963578" Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.943199 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.662940757 podStartE2EDuration="9.943180259s" 
podCreationTimestamp="2025-11-25 15:29:30 +0000 UTC" firstStartedPulling="2025-11-25 15:29:31.984359372 +0000 UTC m=+1516.951953118" lastFinishedPulling="2025-11-25 15:29:39.264598874 +0000 UTC m=+1524.232192620" observedRunningTime="2025-11-25 15:29:39.941615487 +0000 UTC m=+1524.909209233" watchObservedRunningTime="2025-11-25 15:29:39.943180259 +0000 UTC m=+1524.910774005" Nov 25 15:29:39 crc kubenswrapper[4965]: I1125 15:29:39.965618 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.330858097 podStartE2EDuration="9.965589461s" podCreationTimestamp="2025-11-25 15:29:30 +0000 UTC" firstStartedPulling="2025-11-25 15:29:31.792687153 +0000 UTC m=+1516.760280899" lastFinishedPulling="2025-11-25 15:29:39.427418507 +0000 UTC m=+1524.395012263" observedRunningTime="2025-11-25 15:29:39.964687616 +0000 UTC m=+1524.932281362" watchObservedRunningTime="2025-11-25 15:29:39.965589461 +0000 UTC m=+1524.933183207" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.484800 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.485100 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.512472 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.656009 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.900605 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.900684 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" 
Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.924785 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13149900-9b99-4922-9e7f-bf309114a01a","Type":"ContainerStarted","Data":"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497"} Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.924841 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-log" containerID="cri-o://c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939" gracePeriod=30 Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.925015 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-metadata" containerID="cri-o://d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497" gracePeriod=30 Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.929040 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f","Type":"ContainerStarted","Data":"176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44"} Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.946862 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.336447374 podStartE2EDuration="10.946822763s" podCreationTimestamp="2025-11-25 15:29:30 +0000 UTC" firstStartedPulling="2025-11-25 15:29:31.851095327 +0000 UTC m=+1516.818689073" lastFinishedPulling="2025-11-25 15:29:39.461470716 +0000 UTC m=+1524.429064462" observedRunningTime="2025-11-25 15:29:40.941367235 +0000 UTC m=+1525.908960981" watchObservedRunningTime="2025-11-25 15:29:40.946822763 +0000 UTC m=+1525.914416509" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.974701 4965 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 15:29:40 crc kubenswrapper[4965]: I1125 15:29:40.977183 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.100915 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"] Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.127674 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerName="dnsmasq-dns" containerID="cri-o://528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0" gracePeriod=10 Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.785064 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.887806 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-nb\") pod \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.887889 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b24b8\" (UniqueName: \"kubernetes.io/projected/bc935d6a-b651-4c29-9d70-5b9abc6c8580-kube-api-access-b24b8\") pod \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.888026 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-config\") pod 
\"bc935d6a-b651-4c29-9d70-5b9abc6c8580\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.888089 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-dns-svc\") pod \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.888159 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-sb\") pod \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\" (UID: \"bc935d6a-b651-4c29-9d70-5b9abc6c8580\") " Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.909122 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc935d6a-b651-4c29-9d70-5b9abc6c8580-kube-api-access-b24b8" (OuterVolumeSpecName: "kube-api-access-b24b8") pod "bc935d6a-b651-4c29-9d70-5b9abc6c8580" (UID: "bc935d6a-b651-4c29-9d70-5b9abc6c8580"). InnerVolumeSpecName "kube-api-access-b24b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.962372 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.981243 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc935d6a-b651-4c29-9d70-5b9abc6c8580" (UID: "bc935d6a-b651-4c29-9d70-5b9abc6c8580"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.986353 4965 generic.go:334] "Generic (PLEG): container finished" podID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerID="528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0" exitCode=0 Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.986441 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" event={"ID":"bc935d6a-b651-4c29-9d70-5b9abc6c8580","Type":"ContainerDied","Data":"528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0"} Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.986467 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" event={"ID":"bc935d6a-b651-4c29-9d70-5b9abc6c8580","Type":"ContainerDied","Data":"72a8e390831b36bdb794d23575d36f78b3b90de66e8047ced30e5d4659aac04e"} Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.986484 4965 scope.go:117] "RemoveContainer" containerID="528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0" Nov 25 15:29:41 crc kubenswrapper[4965]: I1125 15:29:41.989112 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-d6mx8" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.990287 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85tvt\" (UniqueName: \"kubernetes.io/projected/13149900-9b99-4922-9e7f-bf309114a01a-kube-api-access-85tvt\") pod \"13149900-9b99-4922-9e7f-bf309114a01a\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.990445 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13149900-9b99-4922-9e7f-bf309114a01a-logs\") pod \"13149900-9b99-4922-9e7f-bf309114a01a\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.990473 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-combined-ca-bundle\") pod \"13149900-9b99-4922-9e7f-bf309114a01a\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.990552 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-config-data\") pod \"13149900-9b99-4922-9e7f-bf309114a01a\" (UID: \"13149900-9b99-4922-9e7f-bf309114a01a\") " Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.991536 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.173:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.991818 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.173:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.991925 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.991999 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b24b8\" (UniqueName: \"kubernetes.io/projected/bc935d6a-b651-4c29-9d70-5b9abc6c8580-kube-api-access-b24b8\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:41.992191 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13149900-9b99-4922-9e7f-bf309114a01a-logs" (OuterVolumeSpecName: "logs") pod "13149900-9b99-4922-9e7f-bf309114a01a" (UID: "13149900-9b99-4922-9e7f-bf309114a01a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.005171 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.008230 4965 generic.go:334] "Generic (PLEG): container finished" podID="13149900-9b99-4922-9e7f-bf309114a01a" containerID="d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497" exitCode=0 Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.008277 4965 generic.go:334] "Generic (PLEG): container finished" podID="13149900-9b99-4922-9e7f-bf309114a01a" containerID="c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939" exitCode=143 Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.008585 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13149900-9b99-4922-9e7f-bf309114a01a","Type":"ContainerDied","Data":"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497"} Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.008627 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13149900-9b99-4922-9e7f-bf309114a01a","Type":"ContainerDied","Data":"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939"} Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.008639 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13149900-9b99-4922-9e7f-bf309114a01a","Type":"ContainerDied","Data":"4d5022442b911e320fd03b8fe5464533eb4c63b064aa657d9c69f70c2b9275b9"} Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.020373 4965 generic.go:334] "Generic (PLEG): container finished" podID="697f5736-bc04-45e5-a874-982e8cd7d8e3" containerID="22c5bab323a242afa2cf3d32c11e12c884aa608430fc5ea7d63e62abf8bcbdfb" exitCode=0 Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.021484 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qmnc9" 
event={"ID":"697f5736-bc04-45e5-a874-982e8cd7d8e3","Type":"ContainerDied","Data":"22c5bab323a242afa2cf3d32c11e12c884aa608430fc5ea7d63e62abf8bcbdfb"} Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.038619 4965 scope.go:117] "RemoveContainer" containerID="6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.043167 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13149900-9b99-4922-9e7f-bf309114a01a-kube-api-access-85tvt" (OuterVolumeSpecName: "kube-api-access-85tvt") pod "13149900-9b99-4922-9e7f-bf309114a01a" (UID: "13149900-9b99-4922-9e7f-bf309114a01a"). InnerVolumeSpecName "kube-api-access-85tvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.060253 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc935d6a-b651-4c29-9d70-5b9abc6c8580" (UID: "bc935d6a-b651-4c29-9d70-5b9abc6c8580"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.061995 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13149900-9b99-4922-9e7f-bf309114a01a" (UID: "13149900-9b99-4922-9e7f-bf309114a01a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.091481 4965 scope.go:117] "RemoveContainer" containerID="528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.091849 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0\": container with ID starting with 528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0 not found: ID does not exist" containerID="528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.091898 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0"} err="failed to get container status \"528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0\": rpc error: code = NotFound desc = could not find container \"528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0\": container with ID starting with 528ecdd57c7318b26fd0b05022c7733de10c563c1a9142bee6b9b97b659a03b0 not found: ID does not exist" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.091919 4965 scope.go:117] "RemoveContainer" containerID="6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.096741 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85tvt\" (UniqueName: \"kubernetes.io/projected/13149900-9b99-4922-9e7f-bf309114a01a-kube-api-access-85tvt\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.096767 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.096777 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13149900-9b99-4922-9e7f-bf309114a01a-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.096787 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.097360 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-config" (OuterVolumeSpecName: "config") pod "bc935d6a-b651-4c29-9d70-5b9abc6c8580" (UID: "bc935d6a-b651-4c29-9d70-5b9abc6c8580"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.097503 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb\": container with ID starting with 6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb not found: ID does not exist" containerID="6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.097539 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb"} err="failed to get container status \"6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb\": rpc error: code = NotFound desc = could not find container \"6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb\": container with ID starting with 6fce3ccc2e50429992b9ba951f5c50e3dd23bd1d238ecb9eb6e1eae3da0ff3cb not 
found: ID does not exist" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.097566 4965 scope.go:117] "RemoveContainer" containerID="d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.103841 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc935d6a-b651-4c29-9d70-5b9abc6c8580" (UID: "bc935d6a-b651-4c29-9d70-5b9abc6c8580"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.108667 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-config-data" (OuterVolumeSpecName: "config-data") pod "13149900-9b99-4922-9e7f-bf309114a01a" (UID: "13149900-9b99-4922-9e7f-bf309114a01a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.118641 4965 scope.go:117] "RemoveContainer" containerID="c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.147316 4965 scope.go:117] "RemoveContainer" containerID="d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.148222 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497\": container with ID starting with d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497 not found: ID does not exist" containerID="d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148255 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497"} err="failed to get container status \"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497\": rpc error: code = NotFound desc = could not find container \"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497\": container with ID starting with d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497 not found: ID does not exist" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148280 4965 scope.go:117] "RemoveContainer" containerID="c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.148490 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939\": container with ID starting with 
c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939 not found: ID does not exist" containerID="c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148518 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939"} err="failed to get container status \"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939\": rpc error: code = NotFound desc = could not find container \"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939\": container with ID starting with c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939 not found: ID does not exist" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148532 4965 scope.go:117] "RemoveContainer" containerID="d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148692 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497"} err="failed to get container status \"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497\": rpc error: code = NotFound desc = could not find container \"d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497\": container with ID starting with d682e856ee4afd9204c32dc4420bed9baada85743738e3a028a37a5bd4a0b497 not found: ID does not exist" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148713 4965 scope.go:117] "RemoveContainer" containerID="c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.148881 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939"} err="failed to get container status 
\"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939\": rpc error: code = NotFound desc = could not find container \"c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939\": container with ID starting with c084796ac97a09422b63f369bb4798132356ae30af9ef88805bf89b6c316d939 not found: ID does not exist" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.198089 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.198129 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13149900-9b99-4922-9e7f-bf309114a01a-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.198139 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc935d6a-b651-4c29-9d70-5b9abc6c8580-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.330530 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"] Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.356797 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-d6mx8"] Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.383210 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.396030 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.408432 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.408849 4965 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-metadata" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.408865 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-metadata" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.408874 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-log" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.408881 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-log" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.408894 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerName="init" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.408899 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerName="init" Nov 25 15:29:42 crc kubenswrapper[4965]: E1125 15:29:42.408909 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerName="dnsmasq-dns" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.408914 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerName="dnsmasq-dns" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.409126 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" containerName="dnsmasq-dns" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.409146 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-metadata" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.409164 4965 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="13149900-9b99-4922-9e7f-bf309114a01a" containerName="nova-metadata-log" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.410113 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.413258 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.413430 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.414491 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.503026 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5lx\" (UniqueName: \"kubernetes.io/projected/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-kube-api-access-dz5lx\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.503080 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.503122 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-config-data\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.503202 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-logs\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.503254 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.604517 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5lx\" (UniqueName: \"kubernetes.io/projected/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-kube-api-access-dz5lx\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.604570 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.605536 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-config-data\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.605591 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-logs\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.605630 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.606066 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-logs\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.609605 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-config-data\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.613446 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.613906 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: 
I1125 15:29:42.623616 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5lx\" (UniqueName: \"kubernetes.io/projected/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-kube-api-access-dz5lx\") pod \"nova-metadata-0\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.739239 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.895744 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13149900-9b99-4922-9e7f-bf309114a01a" path="/var/lib/kubelet/pods/13149900-9b99-4922-9e7f-bf309114a01a/volumes" Nov 25 15:29:42 crc kubenswrapper[4965]: I1125 15:29:42.896440 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc935d6a-b651-4c29-9d70-5b9abc6c8580" path="/var/lib/kubelet/pods/bc935d6a-b651-4c29-9d70-5b9abc6c8580/volumes" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.320797 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.486856 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.519680 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsfwv\" (UniqueName: \"kubernetes.io/projected/697f5736-bc04-45e5-a874-982e8cd7d8e3-kube-api-access-jsfwv\") pod \"697f5736-bc04-45e5-a874-982e8cd7d8e3\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.519782 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-scripts\") pod \"697f5736-bc04-45e5-a874-982e8cd7d8e3\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.519873 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-combined-ca-bundle\") pod \"697f5736-bc04-45e5-a874-982e8cd7d8e3\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.519945 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-config-data\") pod \"697f5736-bc04-45e5-a874-982e8cd7d8e3\" (UID: \"697f5736-bc04-45e5-a874-982e8cd7d8e3\") " Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.547134 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-scripts" (OuterVolumeSpecName: "scripts") pod "697f5736-bc04-45e5-a874-982e8cd7d8e3" (UID: "697f5736-bc04-45e5-a874-982e8cd7d8e3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.556182 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697f5736-bc04-45e5-a874-982e8cd7d8e3-kube-api-access-jsfwv" (OuterVolumeSpecName: "kube-api-access-jsfwv") pod "697f5736-bc04-45e5-a874-982e8cd7d8e3" (UID: "697f5736-bc04-45e5-a874-982e8cd7d8e3"). InnerVolumeSpecName "kube-api-access-jsfwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.566766 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "697f5736-bc04-45e5-a874-982e8cd7d8e3" (UID: "697f5736-bc04-45e5-a874-982e8cd7d8e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.585609 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-config-data" (OuterVolumeSpecName: "config-data") pod "697f5736-bc04-45e5-a874-982e8cd7d8e3" (UID: "697f5736-bc04-45e5-a874-982e8cd7d8e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.621716 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.621743 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.621753 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsfwv\" (UniqueName: \"kubernetes.io/projected/697f5736-bc04-45e5-a874-982e8cd7d8e3-kube-api-access-jsfwv\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:43 crc kubenswrapper[4965]: I1125 15:29:43.621763 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697f5736-bc04-45e5-a874-982e8cd7d8e3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.065767 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qmnc9" Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.066885 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qmnc9" event={"ID":"697f5736-bc04-45e5-a874-982e8cd7d8e3","Type":"ContainerDied","Data":"2894207bdfbd35cb7fa3487f03d0449441017e89bd619b80a313ff29517e41ae"} Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.066953 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2894207bdfbd35cb7fa3487f03d0449441017e89bd619b80a313ff29517e41ae" Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.069210 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c","Type":"ContainerStarted","Data":"edcb1442d96aa44a6029eaa6061be74690fb6b55eed1580484bb644cb9bdbe0a"} Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.069261 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c","Type":"ContainerStarted","Data":"f75200b06789685d958651e45ef85806bce17e59eb65311be59df2487b8777b7"} Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.069282 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c","Type":"ContainerStarted","Data":"7de8a960c74fb202b161051cfbe4daf7737b5bdbbafdc2faa3900df615634ad2"} Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.118247 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.118223096 podStartE2EDuration="2.118223096s" podCreationTimestamp="2025-11-25 15:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:44.093739098 +0000 UTC m=+1529.061332854" 
watchObservedRunningTime="2025-11-25 15:29:44.118223096 +0000 UTC m=+1529.085816842" Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.249041 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.249263 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="29ff49c0-272e-40b2-a869-f42e82a37bd2" containerName="nova-scheduler-scheduler" containerID="cri-o://013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" gracePeriod=30 Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.263222 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.264138 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-api" containerID="cri-o://176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44" gracePeriod=30 Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.264068 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-log" containerID="cri-o://af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f" gracePeriod=30 Nov 25 15:29:44 crc kubenswrapper[4965]: I1125 15:29:44.302737 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.078415 4965 generic.go:334] "Generic (PLEG): container finished" podID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerID="af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f" exitCode=143 Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.078512 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f","Type":"ContainerDied","Data":"af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f"} Nov 25 15:29:45 crc kubenswrapper[4965]: E1125 15:29:45.485365 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c is running failed: container process not found" containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 25 15:29:45 crc kubenswrapper[4965]: E1125 15:29:45.486098 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c is running failed: container process not found" containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 25 15:29:45 crc kubenswrapper[4965]: E1125 15:29:45.486353 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c is running failed: container process not found" containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 25 15:29:45 crc kubenswrapper[4965]: E1125 15:29:45.486388 4965 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="29ff49c0-272e-40b2-a869-f42e82a37bd2" containerName="nova-scheduler-scheduler" Nov 25 15:29:45 
crc kubenswrapper[4965]: I1125 15:29:45.558689 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.659634 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-config-data\") pod \"29ff49c0-272e-40b2-a869-f42e82a37bd2\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.659805 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-combined-ca-bundle\") pod \"29ff49c0-272e-40b2-a869-f42e82a37bd2\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.659883 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69khl\" (UniqueName: \"kubernetes.io/projected/29ff49c0-272e-40b2-a869-f42e82a37bd2-kube-api-access-69khl\") pod \"29ff49c0-272e-40b2-a869-f42e82a37bd2\" (UID: \"29ff49c0-272e-40b2-a869-f42e82a37bd2\") " Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.665274 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ff49c0-272e-40b2-a869-f42e82a37bd2-kube-api-access-69khl" (OuterVolumeSpecName: "kube-api-access-69khl") pod "29ff49c0-272e-40b2-a869-f42e82a37bd2" (UID: "29ff49c0-272e-40b2-a869-f42e82a37bd2"). InnerVolumeSpecName "kube-api-access-69khl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.691139 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-config-data" (OuterVolumeSpecName: "config-data") pod "29ff49c0-272e-40b2-a869-f42e82a37bd2" (UID: "29ff49c0-272e-40b2-a869-f42e82a37bd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.695911 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29ff49c0-272e-40b2-a869-f42e82a37bd2" (UID: "29ff49c0-272e-40b2-a869-f42e82a37bd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.762352 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.762391 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ff49c0-272e-40b2-a869-f42e82a37bd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:45 crc kubenswrapper[4965]: I1125 15:29:45.762408 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69khl\" (UniqueName: \"kubernetes.io/projected/29ff49c0-272e-40b2-a869-f42e82a37bd2-kube-api-access-69khl\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.098983 4965 generic.go:334] "Generic (PLEG): container finished" podID="29ff49c0-272e-40b2-a869-f42e82a37bd2" containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" 
exitCode=0 Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.099040 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.099088 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ff49c0-272e-40b2-a869-f42e82a37bd2","Type":"ContainerDied","Data":"013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c"} Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.099135 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ff49c0-272e-40b2-a869-f42e82a37bd2","Type":"ContainerDied","Data":"7fea9478d758c0386384bd097a515b401c6a07e2f21d10b9c57b18213656f4d2"} Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.099157 4965 scope.go:117] "RemoveContainer" containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.099726 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-log" containerID="cri-o://f75200b06789685d958651e45ef85806bce17e59eb65311be59df2487b8777b7" gracePeriod=30 Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.099834 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-metadata" containerID="cri-o://edcb1442d96aa44a6029eaa6061be74690fb6b55eed1580484bb644cb9bdbe0a" gracePeriod=30 Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.144872 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.147471 4965 scope.go:117] "RemoveContainer" 
containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" Nov 25 15:29:46 crc kubenswrapper[4965]: E1125 15:29:46.148620 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c\": container with ID starting with 013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c not found: ID does not exist" containerID="013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.148676 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c"} err="failed to get container status \"013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c\": rpc error: code = NotFound desc = could not find container \"013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c\": container with ID starting with 013435fc9df514d2e7a656375be8836f18b6b4ea6ecd20a0f034f08f1480871c not found: ID does not exist" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.155126 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.168518 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:46 crc kubenswrapper[4965]: E1125 15:29:46.171224 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697f5736-bc04-45e5-a874-982e8cd7d8e3" containerName="nova-manage" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.171308 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="697f5736-bc04-45e5-a874-982e8cd7d8e3" containerName="nova-manage" Nov 25 15:29:46 crc kubenswrapper[4965]: E1125 15:29:46.171356 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ff49c0-272e-40b2-a869-f42e82a37bd2" 
containerName="nova-scheduler-scheduler" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.171367 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ff49c0-272e-40b2-a869-f42e82a37bd2" containerName="nova-scheduler-scheduler" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.174527 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="697f5736-bc04-45e5-a874-982e8cd7d8e3" containerName="nova-manage" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.174587 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ff49c0-272e-40b2-a869-f42e82a37bd2" containerName="nova-scheduler-scheduler" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.175471 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.179070 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.187200 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.270224 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg8zj\" (UniqueName: \"kubernetes.io/projected/5598c7d5-0a06-43c1-8ed9-94e57832fdea-kube-api-access-tg8zj\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.270328 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-config-data\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.270395 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.372271 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.372417 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg8zj\" (UniqueName: \"kubernetes.io/projected/5598c7d5-0a06-43c1-8ed9-94e57832fdea-kube-api-access-tg8zj\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.372517 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-config-data\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.377835 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-config-data\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.377924 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.389920 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg8zj\" (UniqueName: \"kubernetes.io/projected/5598c7d5-0a06-43c1-8ed9-94e57832fdea-kube-api-access-tg8zj\") pod \"nova-scheduler-0\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") " pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.501571 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 15:29:46 crc kubenswrapper[4965]: I1125 15:29:46.783953 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ff49c0-272e-40b2-a869-f42e82a37bd2" path="/var/lib/kubelet/pods/29ff49c0-272e-40b2-a869-f42e82a37bd2/volumes" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.030604 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.111681 4965 generic.go:334] "Generic (PLEG): container finished" podID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerID="edcb1442d96aa44a6029eaa6061be74690fb6b55eed1580484bb644cb9bdbe0a" exitCode=0 Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.111706 4965 generic.go:334] "Generic (PLEG): container finished" podID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerID="f75200b06789685d958651e45ef85806bce17e59eb65311be59df2487b8777b7" exitCode=143 Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.111749 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c","Type":"ContainerDied","Data":"edcb1442d96aa44a6029eaa6061be74690fb6b55eed1580484bb644cb9bdbe0a"} Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 
15:29:47.111773 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c","Type":"ContainerDied","Data":"f75200b06789685d958651e45ef85806bce17e59eb65311be59df2487b8777b7"} Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.111784 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c","Type":"ContainerDied","Data":"7de8a960c74fb202b161051cfbe4daf7737b5bdbbafdc2faa3900df615634ad2"} Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.111793 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de8a960c74fb202b161051cfbe4daf7737b5bdbbafdc2faa3900df615634ad2" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.119103 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5598c7d5-0a06-43c1-8ed9-94e57832fdea","Type":"ContainerStarted","Data":"98abdb9d05599d5f9cac74bc324da73ef76cc1a225978072de898dcc26c50be6"} Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.176111 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.295689 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-nova-metadata-tls-certs\") pod \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.296394 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz5lx\" (UniqueName: \"kubernetes.io/projected/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-kube-api-access-dz5lx\") pod \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.296529 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-config-data\") pod \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.296650 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-combined-ca-bundle\") pod \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.296798 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-logs\") pod \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\" (UID: \"fbcc1c57-2d80-4d4f-bb66-310cfe78c27c\") " Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.297849 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-logs" (OuterVolumeSpecName: "logs") pod "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" (UID: "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.305143 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-kube-api-access-dz5lx" (OuterVolumeSpecName: "kube-api-access-dz5lx") pod "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" (UID: "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c"). InnerVolumeSpecName "kube-api-access-dz5lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.321710 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-config-data" (OuterVolumeSpecName: "config-data") pod "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" (UID: "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.338915 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" (UID: "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.358535 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" (UID: "fbcc1c57-2d80-4d4f-bb66-310cfe78c27c"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.399235 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.399308 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.399325 4965 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.399345 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz5lx\" (UniqueName: \"kubernetes.io/projected/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-kube-api-access-dz5lx\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:47 crc kubenswrapper[4965]: I1125 15:29:47.399363 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.133715 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.133786 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5598c7d5-0a06-43c1-8ed9-94e57832fdea","Type":"ContainerStarted","Data":"30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5"} Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.178668 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.178618574 podStartE2EDuration="2.178618574s" podCreationTimestamp="2025-11-25 15:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:48.161414495 +0000 UTC m=+1533.129008271" watchObservedRunningTime="2025-11-25 15:29:48.178618574 +0000 UTC m=+1533.146212330" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.197806 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.218146 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.235164 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:48 crc kubenswrapper[4965]: E1125 15:29:48.235764 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-log" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.235795 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-log" Nov 25 15:29:48 crc kubenswrapper[4965]: E1125 15:29:48.235837 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-metadata" Nov 25 15:29:48 
crc kubenswrapper[4965]: I1125 15:29:48.235849 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-metadata" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.236176 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-log" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.236218 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" containerName="nova-metadata-metadata" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.237703 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.241220 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.241640 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.265841 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.427307 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7jr\" (UniqueName: \"kubernetes.io/projected/db135a55-a79d-4cd7-8757-75d9fd2f17d7-kube-api-access-lc7jr\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.427644 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-config-data\") pod \"nova-metadata-0\" (UID: 
\"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.427709 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.427730 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db135a55-a79d-4cd7-8757-75d9fd2f17d7-logs\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.427953 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.529540 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.529663 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc7jr\" (UniqueName: \"kubernetes.io/projected/db135a55-a79d-4cd7-8757-75d9fd2f17d7-kube-api-access-lc7jr\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc 
kubenswrapper[4965]: I1125 15:29:48.529712 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-config-data\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.529763 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.529785 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db135a55-a79d-4cd7-8757-75d9fd2f17d7-logs\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.530188 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db135a55-a79d-4cd7-8757-75d9fd2f17d7-logs\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.535465 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.544552 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.554712 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-config-data\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.555274 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc7jr\" (UniqueName: \"kubernetes.io/projected/db135a55-a79d-4cd7-8757-75d9fd2f17d7-kube-api-access-lc7jr\") pod \"nova-metadata-0\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") " pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.562872 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:29:48 crc kubenswrapper[4965]: I1125 15:29:48.790703 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbcc1c57-2d80-4d4f-bb66-310cfe78c27c" path="/var/lib/kubelet/pods/fbcc1c57-2d80-4d4f-bb66-310cfe78c27c/volumes" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.056182 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.122843 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.154626 4965 generic.go:334] "Generic (PLEG): container finished" podID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerID="176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44" exitCode=0 Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.155625 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f","Type":"ContainerDied","Data":"176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44"} Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.155732 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f","Type":"ContainerDied","Data":"988580a5cde730bb491b90e70be37950b0f4fddcb6fa6787e652f8ae8e010d0f"} Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.155816 4965 scope.go:117] "RemoveContainer" containerID="176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.156017 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.159656 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db135a55-a79d-4cd7-8757-75d9fd2f17d7","Type":"ContainerStarted","Data":"1c267eb0c27e48898bc7baa910f646207f951be4b6e0af48c06383b119c3ac85"} Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.186846 4965 scope.go:117] "RemoveContainer" containerID="af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.240506 4965 scope.go:117] "RemoveContainer" containerID="176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44" Nov 25 15:29:49 crc kubenswrapper[4965]: E1125 15:29:49.241131 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44\": container with ID starting with 176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44 not found: ID does not exist" containerID="176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.241182 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44"} err="failed to get container status \"176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44\": rpc error: code = NotFound desc = could not find container \"176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44\": container with ID starting with 176a1018bd200d170cc878894f0b1566978a516cc0e872bae23088d787977e44 not found: ID does not exist" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.241229 4965 scope.go:117] "RemoveContainer" containerID="af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f" Nov 25 15:29:49 crc kubenswrapper[4965]: E1125 
15:29:49.241515 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f\": container with ID starting with af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f not found: ID does not exist" containerID="af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.241541 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f"} err="failed to get container status \"af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f\": rpc error: code = NotFound desc = could not find container \"af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f\": container with ID starting with af3c3090fa78a2af2a4c348dc453a4290ae6f7930dffc244c6a0f565e74cd58f not found: ID does not exist" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.241761 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blfcc\" (UniqueName: \"kubernetes.io/projected/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-kube-api-access-blfcc\") pod \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.241860 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-combined-ca-bundle\") pod \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.241998 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-logs\") pod 
\"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.242060 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-config-data\") pod \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\" (UID: \"4d9bf3a8-0b53-4cde-961f-8c1f85b4929f\") " Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.243443 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-logs" (OuterVolumeSpecName: "logs") pod "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" (UID: "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.248089 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-kube-api-access-blfcc" (OuterVolumeSpecName: "kube-api-access-blfcc") pod "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" (UID: "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f"). InnerVolumeSpecName "kube-api-access-blfcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.271063 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-config-data" (OuterVolumeSpecName: "config-data") pod "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" (UID: "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.277207 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" (UID: "4d9bf3a8-0b53-4cde-961f-8c1f85b4929f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.343650 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blfcc\" (UniqueName: \"kubernetes.io/projected/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-kube-api-access-blfcc\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.346131 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.346252 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.346327 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.494482 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.504959 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.515163 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] 
Nov 25 15:29:49 crc kubenswrapper[4965]: E1125 15:29:49.515740 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-api" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.515851 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-api" Nov 25 15:29:49 crc kubenswrapper[4965]: E1125 15:29:49.515942 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-log" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.516040 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-log" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.516277 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-log" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.516345 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" containerName="nova-api-api" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.518256 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.523403 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.554592 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.651302 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-logs\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.651365 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.651461 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-config-data\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.651596 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7hp\" (UniqueName: \"kubernetes.io/projected/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-kube-api-access-nr7hp\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.753455 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-config-data\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.753588 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7hp\" (UniqueName: \"kubernetes.io/projected/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-kube-api-access-nr7hp\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.753633 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-logs\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.753659 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.754148 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-logs\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.759839 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.763632 
4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-config-data\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.772722 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7hp\" (UniqueName: \"kubernetes.io/projected/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-kube-api-access-nr7hp\") pod \"nova-api-0\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") " pod="openstack/nova-api-0" Nov 25 15:29:49 crc kubenswrapper[4965]: I1125 15:29:49.888411 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:29:50 crc kubenswrapper[4965]: I1125 15:29:50.170994 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db135a55-a79d-4cd7-8757-75d9fd2f17d7","Type":"ContainerStarted","Data":"7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375"} Nov 25 15:29:50 crc kubenswrapper[4965]: I1125 15:29:50.171496 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db135a55-a79d-4cd7-8757-75d9fd2f17d7","Type":"ContainerStarted","Data":"78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290"} Nov 25 15:29:50 crc kubenswrapper[4965]: I1125 15:29:50.208700 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.205505647 podStartE2EDuration="2.205505647s" podCreationTimestamp="2025-11-25 15:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:50.204406038 +0000 UTC m=+1535.171999784" watchObservedRunningTime="2025-11-25 15:29:50.205505647 +0000 UTC m=+1535.173099393" Nov 25 15:29:50 crc kubenswrapper[4965]: 
I1125 15:29:50.460149 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:29:50 crc kubenswrapper[4965]: I1125 15:29:50.783450 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d9bf3a8-0b53-4cde-961f-8c1f85b4929f" path="/var/lib/kubelet/pods/4d9bf3a8-0b53-4cde-961f-8c1f85b4929f/volumes" Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.189953 4965 generic.go:334] "Generic (PLEG): container finished" podID="814b2e75-fe75-4874-b493-f739902f9202" containerID="e3e6372085103461a2828509236bdbd47361dcb86a73d0c276e3e6e89ef91891" exitCode=0 Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.191030 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r7rws" event={"ID":"814b2e75-fe75-4874-b493-f739902f9202","Type":"ContainerDied","Data":"e3e6372085103461a2828509236bdbd47361dcb86a73d0c276e3e6e89ef91891"} Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.206172 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6e7a54ba-ed58-416f-9336-ea8ae976cbfa","Type":"ContainerStarted","Data":"4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f"} Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.206251 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6e7a54ba-ed58-416f-9336-ea8ae976cbfa","Type":"ContainerStarted","Data":"20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1"} Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.206268 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6e7a54ba-ed58-416f-9336-ea8ae976cbfa","Type":"ContainerStarted","Data":"2310f75c8828fb4739dbbefc32d8068351293383b866cb4292746c4822af90cc"} Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.247897 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.247880159 podStartE2EDuration="2.247880159s" podCreationTimestamp="2025-11-25 15:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:51.242552614 +0000 UTC m=+1536.210146370" watchObservedRunningTime="2025-11-25 15:29:51.247880159 +0000 UTC m=+1536.215473905" Nov 25 15:29:51 crc kubenswrapper[4965]: I1125 15:29:51.502723 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.568721 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.740322 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-combined-ca-bundle\") pod \"814b2e75-fe75-4874-b493-f739902f9202\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.740418 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-config-data\") pod \"814b2e75-fe75-4874-b493-f739902f9202\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.740477 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77d8r\" (UniqueName: \"kubernetes.io/projected/814b2e75-fe75-4874-b493-f739902f9202-kube-api-access-77d8r\") pod \"814b2e75-fe75-4874-b493-f739902f9202\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.740579 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-scripts\") pod \"814b2e75-fe75-4874-b493-f739902f9202\" (UID: \"814b2e75-fe75-4874-b493-f739902f9202\") " Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.745726 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-scripts" (OuterVolumeSpecName: "scripts") pod "814b2e75-fe75-4874-b493-f739902f9202" (UID: "814b2e75-fe75-4874-b493-f739902f9202"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.746084 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814b2e75-fe75-4874-b493-f739902f9202-kube-api-access-77d8r" (OuterVolumeSpecName: "kube-api-access-77d8r") pod "814b2e75-fe75-4874-b493-f739902f9202" (UID: "814b2e75-fe75-4874-b493-f739902f9202"). InnerVolumeSpecName "kube-api-access-77d8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.778040 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-config-data" (OuterVolumeSpecName: "config-data") pod "814b2e75-fe75-4874-b493-f739902f9202" (UID: "814b2e75-fe75-4874-b493-f739902f9202"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.781527 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "814b2e75-fe75-4874-b493-f739902f9202" (UID: "814b2e75-fe75-4874-b493-f739902f9202"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.844110 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.844161 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.844174 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77d8r\" (UniqueName: \"kubernetes.io/projected/814b2e75-fe75-4874-b493-f739902f9202-kube-api-access-77d8r\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:52 crc kubenswrapper[4965]: I1125 15:29:52.844188 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814b2e75-fe75-4874-b493-f739902f9202-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.226546 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r7rws" event={"ID":"814b2e75-fe75-4874-b493-f739902f9202","Type":"ContainerDied","Data":"ed481eb3ce30f2137a54ae751deaa04eb094f580528e150af5ce7c6c578f0cb4"} Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.226584 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed481eb3ce30f2137a54ae751deaa04eb094f580528e150af5ce7c6c578f0cb4" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.226662 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r7rws" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.260772 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.260857 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.320898 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 15:29:53 crc kubenswrapper[4965]: E1125 15:29:53.321790 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814b2e75-fe75-4874-b493-f739902f9202" containerName="nova-cell1-conductor-db-sync" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.321822 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="814b2e75-fe75-4874-b493-f739902f9202" containerName="nova-cell1-conductor-db-sync" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.322245 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="814b2e75-fe75-4874-b493-f739902f9202" containerName="nova-cell1-conductor-db-sync" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.323697 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.327155 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.350587 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.370275 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.370535 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdjpd\" (UniqueName: \"kubernetes.io/projected/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-kube-api-access-vdjpd\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.370642 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.471752 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc 
kubenswrapper[4965]: I1125 15:29:53.472302 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.472370 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdjpd\" (UniqueName: \"kubernetes.io/projected/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-kube-api-access-vdjpd\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.481623 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.492718 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.500916 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdjpd\" (UniqueName: \"kubernetes.io/projected/dfb126d7-e0c4-4273-9ede-3a1425a6e36c-kube-api-access-vdjpd\") pod \"nova-cell1-conductor-0\" (UID: \"dfb126d7-e0c4-4273-9ede-3a1425a6e36c\") " pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.563371 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.563481 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 15:29:53 crc kubenswrapper[4965]: I1125 15:29:53.649831 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:54 crc kubenswrapper[4965]: I1125 15:29:54.176667 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 15:29:54 crc kubenswrapper[4965]: I1125 15:29:54.238869 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dfb126d7-e0c4-4273-9ede-3a1425a6e36c","Type":"ContainerStarted","Data":"80cff30380ee9643ec29d9e4f3bc5947364d29afb6c5fdedd6a9545e818b2640"} Nov 25 15:29:55 crc kubenswrapper[4965]: I1125 15:29:55.249032 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dfb126d7-e0c4-4273-9ede-3a1425a6e36c","Type":"ContainerStarted","Data":"0a93782fed4f40932bd3dffab915fee7304c63176f4e54ebe181db71ded69a8c"} Nov 25 15:29:55 crc kubenswrapper[4965]: I1125 15:29:55.250249 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 25 15:29:56 crc kubenswrapper[4965]: I1125 15:29:56.502459 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 15:29:56 crc kubenswrapper[4965]: I1125 15:29:56.552716 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 15:29:56 crc kubenswrapper[4965]: I1125 15:29:56.573903 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.573880169 podStartE2EDuration="3.573880169s" podCreationTimestamp="2025-11-25 15:29:53 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:29:55.273569511 +0000 UTC m=+1540.241163267" watchObservedRunningTime="2025-11-25 15:29:56.573880169 +0000 UTC m=+1541.541473925" Nov 25 15:29:57 crc kubenswrapper[4965]: I1125 15:29:57.312419 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 15:29:58 crc kubenswrapper[4965]: I1125 15:29:58.563746 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 15:29:58 crc kubenswrapper[4965]: I1125 15:29:58.564764 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 15:29:59 crc kubenswrapper[4965]: I1125 15:29:59.579177 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 15:29:59 crc kubenswrapper[4965]: I1125 15:29:59.579207 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 15:29:59 crc kubenswrapper[4965]: I1125 15:29:59.889115 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 15:29:59 crc kubenswrapper[4965]: I1125 15:29:59.890703 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.147096 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s"] Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.150087 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.153061 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.153411 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.175152 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s"] Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.330364 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c945a952-8311-48c4-826f-db1aa5659c0c-config-volume\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.330531 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkq7\" (UniqueName: \"kubernetes.io/projected/c945a952-8311-48c4-826f-db1aa5659c0c-kube-api-access-pgkq7\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.330569 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c945a952-8311-48c4-826f-db1aa5659c0c-secret-volume\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.432031 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c945a952-8311-48c4-826f-db1aa5659c0c-config-volume\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.432190 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkq7\" (UniqueName: \"kubernetes.io/projected/c945a952-8311-48c4-826f-db1aa5659c0c-kube-api-access-pgkq7\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.432240 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c945a952-8311-48c4-826f-db1aa5659c0c-secret-volume\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.433303 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c945a952-8311-48c4-826f-db1aa5659c0c-config-volume\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.448624 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c945a952-8311-48c4-826f-db1aa5659c0c-secret-volume\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.455570 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkq7\" (UniqueName: \"kubernetes.io/projected/c945a952-8311-48c4-826f-db1aa5659c0c-kube-api-access-pgkq7\") pod \"collect-profiles-29401410-z5n7s\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.480465 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.952679 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s"] Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.982117 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:30:00 crc kubenswrapper[4965]: I1125 15:30:00.983029 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:30:01 crc kubenswrapper[4965]: I1125 15:30:01.302294 4965 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" event={"ID":"c945a952-8311-48c4-826f-db1aa5659c0c","Type":"ContainerStarted","Data":"73a86910ae414be847922fdca6d11e1c699f451d54e5270246f9badf7a6031d8"} Nov 25 15:30:01 crc kubenswrapper[4965]: I1125 15:30:01.302584 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" event={"ID":"c945a952-8311-48c4-826f-db1aa5659c0c","Type":"ContainerStarted","Data":"0266652a47aef712dfc5d53ae00dce48d782b740f0029d0d7426f71ec6c44536"} Nov 25 15:30:01 crc kubenswrapper[4965]: I1125 15:30:01.317266 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" podStartSLOduration=1.3172499819999999 podStartE2EDuration="1.317249982s" podCreationTimestamp="2025-11-25 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:01.314763484 +0000 UTC m=+1546.282357230" watchObservedRunningTime="2025-11-25 15:30:01.317249982 +0000 UTC m=+1546.284843728" Nov 25 15:30:02 crc kubenswrapper[4965]: I1125 15:30:02.311888 4965 generic.go:334] "Generic (PLEG): container finished" podID="c945a952-8311-48c4-826f-db1aa5659c0c" containerID="73a86910ae414be847922fdca6d11e1c699f451d54e5270246f9badf7a6031d8" exitCode=0 Nov 25 15:30:02 crc kubenswrapper[4965]: I1125 15:30:02.312007 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" event={"ID":"c945a952-8311-48c4-826f-db1aa5659c0c","Type":"ContainerDied","Data":"73a86910ae414be847922fdca6d11e1c699f451d54e5270246f9badf7a6031d8"} Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.685206 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.692746 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.795518 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c945a952-8311-48c4-826f-db1aa5659c0c-secret-volume\") pod \"c945a952-8311-48c4-826f-db1aa5659c0c\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.795641 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgkq7\" (UniqueName: \"kubernetes.io/projected/c945a952-8311-48c4-826f-db1aa5659c0c-kube-api-access-pgkq7\") pod \"c945a952-8311-48c4-826f-db1aa5659c0c\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.795731 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c945a952-8311-48c4-826f-db1aa5659c0c-config-volume\") pod \"c945a952-8311-48c4-826f-db1aa5659c0c\" (UID: \"c945a952-8311-48c4-826f-db1aa5659c0c\") " Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.796649 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c945a952-8311-48c4-826f-db1aa5659c0c-config-volume" (OuterVolumeSpecName: "config-volume") pod "c945a952-8311-48c4-826f-db1aa5659c0c" (UID: "c945a952-8311-48c4-826f-db1aa5659c0c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.800880 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c945a952-8311-48c4-826f-db1aa5659c0c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c945a952-8311-48c4-826f-db1aa5659c0c" (UID: "c945a952-8311-48c4-826f-db1aa5659c0c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.807143 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c945a952-8311-48c4-826f-db1aa5659c0c-kube-api-access-pgkq7" (OuterVolumeSpecName: "kube-api-access-pgkq7") pod "c945a952-8311-48c4-826f-db1aa5659c0c" (UID: "c945a952-8311-48c4-826f-db1aa5659c0c"). InnerVolumeSpecName "kube-api-access-pgkq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.898426 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgkq7\" (UniqueName: \"kubernetes.io/projected/c945a952-8311-48c4-826f-db1aa5659c0c-kube-api-access-pgkq7\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.898462 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c945a952-8311-48c4-826f-db1aa5659c0c-config-volume\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:03 crc kubenswrapper[4965]: I1125 15:30:03.898473 4965 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c945a952-8311-48c4-826f-db1aa5659c0c-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:04 crc kubenswrapper[4965]: I1125 15:30:04.328779 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s" event={"ID":"c945a952-8311-48c4-826f-db1aa5659c0c","Type":"ContainerDied","Data":"0266652a47aef712dfc5d53ae00dce48d782b740f0029d0d7426f71ec6c44536"}
Nov 25 15:30:04 crc kubenswrapper[4965]: I1125 15:30:04.328816 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0266652a47aef712dfc5d53ae00dce48d782b740f0029d0d7426f71ec6c44536"
Nov 25 15:30:04 crc kubenswrapper[4965]: I1125 15:30:04.329152 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-z5n7s"
Nov 25 15:30:08 crc kubenswrapper[4965]: I1125 15:30:08.568260 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 25 15:30:08 crc kubenswrapper[4965]: I1125 15:30:08.570106 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 25 15:30:08 crc kubenswrapper[4965]: I1125 15:30:08.574186 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 25 15:30:09 crc kubenswrapper[4965]: I1125 15:30:09.379653 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 25 15:30:09 crc kubenswrapper[4965]: I1125 15:30:09.892123 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 25 15:30:09 crc kubenswrapper[4965]: I1125 15:30:09.893551 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 25 15:30:09 crc kubenswrapper[4965]: I1125 15:30:09.895938 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 25 15:30:09 crc kubenswrapper[4965]: I1125 15:30:09.896698 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.307980 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.382493 4965 generic.go:334] "Generic (PLEG): container finished" podID="a3094974-d007-468c-8c97-455d28c6f1ff" containerID="5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7" exitCode=137
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.383064 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3094974-d007-468c-8c97-455d28c6f1ff","Type":"ContainerDied","Data":"5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7"}
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.383106 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3094974-d007-468c-8c97-455d28c6f1ff","Type":"ContainerDied","Data":"46dc8a213f8c43af39f90cc38c625a7102e97387605b733483511485403751be"}
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.383124 4965 scope.go:117] "RemoveContainer" containerID="5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.383133 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.383512 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.388470 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.417664 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-combined-ca-bundle\") pod \"a3094974-d007-468c-8c97-455d28c6f1ff\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") "
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.418078 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhv6r\" (UniqueName: \"kubernetes.io/projected/a3094974-d007-468c-8c97-455d28c6f1ff-kube-api-access-nhv6r\") pod \"a3094974-d007-468c-8c97-455d28c6f1ff\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") "
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.418196 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-config-data\") pod \"a3094974-d007-468c-8c97-455d28c6f1ff\" (UID: \"a3094974-d007-468c-8c97-455d28c6f1ff\") "
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.420136 4965 scope.go:117] "RemoveContainer" containerID="5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7"
Nov 25 15:30:10 crc kubenswrapper[4965]: E1125 15:30:10.420463 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7\": container with ID starting with 5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7 not found: ID does not exist" containerID="5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.420498 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7"} err="failed to get container status \"5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7\": rpc error: code = NotFound desc = could not find container \"5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7\": container with ID starting with 5cf0fc03c269e91ac32cbcb16aed535187071b0d04b4f96fcb3170b4b42bc3a7 not found: ID does not exist"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.425290 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3094974-d007-468c-8c97-455d28c6f1ff-kube-api-access-nhv6r" (OuterVolumeSpecName: "kube-api-access-nhv6r") pod "a3094974-d007-468c-8c97-455d28c6f1ff" (UID: "a3094974-d007-468c-8c97-455d28c6f1ff"). InnerVolumeSpecName "kube-api-access-nhv6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.483793 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-config-data" (OuterVolumeSpecName: "config-data") pod "a3094974-d007-468c-8c97-455d28c6f1ff" (UID: "a3094974-d007-468c-8c97-455d28c6f1ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.486136 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3094974-d007-468c-8c97-455d28c6f1ff" (UID: "a3094974-d007-468c-8c97-455d28c6f1ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.520756 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.520942 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhv6r\" (UniqueName: \"kubernetes.io/projected/a3094974-d007-468c-8c97-455d28c6f1ff-kube-api-access-nhv6r\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.521026 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3094974-d007-468c-8c97-455d28c6f1ff-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.591287 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-st74t"]
Nov 25 15:30:10 crc kubenswrapper[4965]: E1125 15:30:10.591730 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3094974-d007-468c-8c97-455d28c6f1ff" containerName="nova-cell1-novncproxy-novncproxy"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.591747 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3094974-d007-468c-8c97-455d28c6f1ff" containerName="nova-cell1-novncproxy-novncproxy"
Nov 25 15:30:10 crc kubenswrapper[4965]: E1125 15:30:10.591768 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c945a952-8311-48c4-826f-db1aa5659c0c" containerName="collect-profiles"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.591777 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c945a952-8311-48c4-826f-db1aa5659c0c" containerName="collect-profiles"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.592727 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c945a952-8311-48c4-826f-db1aa5659c0c" containerName="collect-profiles"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.592758 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3094974-d007-468c-8c97-455d28c6f1ff" containerName="nova-cell1-novncproxy-novncproxy"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.594435 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.619716 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-st74t"]
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.726662 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.726773 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-dns-svc\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.726849 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.726926 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.726953 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-755jh\" (UniqueName: \"kubernetes.io/projected/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-kube-api-access-755jh\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.727034 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-config\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.738898 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.750512 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.751737 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.757958 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.758316 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.758470 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.758568 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.785779 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3094974-d007-468c-8c97-455d28c6f1ff" path="/var/lib/kubelet/pods/a3094974-d007-468c-8c97-455d28c6f1ff/volumes"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828424 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828543 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-dns-svc\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828582 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828609 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828638 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828656 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828673 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-755jh\" (UniqueName: \"kubernetes.io/projected/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-kube-api-access-755jh\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828693 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-config\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828711 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.828741 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwkxd\" (UniqueName: \"kubernetes.io/projected/90d758e5-3565-44c4-9243-6f331e5eabf0-kube-api-access-bwkxd\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.829852 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.829939 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-dns-svc\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.830340 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.830715 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-config\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.846991 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-755jh\" (UniqueName: \"kubernetes.io/projected/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-kube-api-access-755jh\") pod \"dnsmasq-dns-5b856c5697-st74t\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.921115 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.936574 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.938817 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.939016 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.939335 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwkxd\" (UniqueName: \"kubernetes.io/projected/90d758e5-3565-44c4-9243-6f331e5eabf0-kube-api-access-bwkxd\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.939482 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.940837 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.943661 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.946355 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.946388 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d758e5-3565-44c4-9243-6f331e5eabf0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:10 crc kubenswrapper[4965]: I1125 15:30:10.959886 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwkxd\" (UniqueName: \"kubernetes.io/projected/90d758e5-3565-44c4-9243-6f331e5eabf0-kube-api-access-bwkxd\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d758e5-3565-44c4-9243-6f331e5eabf0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:11 crc kubenswrapper[4965]: I1125 15:30:11.097755 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:11 crc kubenswrapper[4965]: I1125 15:30:11.483851 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-st74t"]
Nov 25 15:30:11 crc kubenswrapper[4965]: I1125 15:30:11.710197 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.416416 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90d758e5-3565-44c4-9243-6f331e5eabf0","Type":"ContainerStarted","Data":"c1f67e6a80868d1a7a81200eb66823310faaa35156b5961e6d3c52288e1a79d1"}
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.416466 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90d758e5-3565-44c4-9243-6f331e5eabf0","Type":"ContainerStarted","Data":"9142d0a427f47db51c2aaed756ea6aa0c242e56605bd65616c551da4a2ea83ee"}
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.419328 4965 generic.go:334] "Generic (PLEG): container finished" podID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerID="26a860b1750ff11e609f4b8aa8615f1dbc39a54e5475d8be15e6155680b112fc" exitCode=0
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.419406 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-st74t" event={"ID":"9e098e76-ffb9-40ad-9312-96f70fc8d0d7","Type":"ContainerDied","Data":"26a860b1750ff11e609f4b8aa8615f1dbc39a54e5475d8be15e6155680b112fc"}
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.419455 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-st74t" event={"ID":"9e098e76-ffb9-40ad-9312-96f70fc8d0d7","Type":"ContainerStarted","Data":"0004dca969bce8ac84ed56b8a60152c55a3a94af88c92c8b849843aa6cfa17f1"}
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.487473 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.487451042 podStartE2EDuration="2.487451042s" podCreationTimestamp="2025-11-25 15:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:12.447288906 +0000 UTC m=+1557.414882662" watchObservedRunningTime="2025-11-25 15:30:12.487451042 +0000 UTC m=+1557.455044778"
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.498383 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.498729 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-central-agent" containerID="cri-o://f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118" gracePeriod=30
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.499048 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="proxy-httpd" containerID="cri-o://aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d" gracePeriod=30
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.499179 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-notification-agent" containerID="cri-o://28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93" gracePeriod=30
Nov 25 15:30:12 crc kubenswrapper[4965]: I1125 15:30:12.499248 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="sg-core" containerID="cri-o://cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49" gracePeriod=30
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.101290 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.429822 4965 generic.go:334] "Generic (PLEG): container finished" podID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerID="aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d" exitCode=0
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.429848 4965 generic.go:334] "Generic (PLEG): container finished" podID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerID="cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49" exitCode=2
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.429855 4965 generic.go:334] "Generic (PLEG): container finished" podID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerID="f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118" exitCode=0
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.429890 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerDied","Data":"aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d"}
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.429936 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerDied","Data":"cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49"}
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.429946 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerDied","Data":"f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118"}
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.431705 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-st74t" event={"ID":"9e098e76-ffb9-40ad-9312-96f70fc8d0d7","Type":"ContainerStarted","Data":"4fd1f88c2e262bf818afd5b86c36aa1f88445dcc8b1707479ca612416cd8d984"}
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.432583 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-log" containerID="cri-o://20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1" gracePeriod=30
Nov 25 15:30:13 crc kubenswrapper[4965]: I1125 15:30:13.432713 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-api" containerID="cri-o://4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f" gracePeriod=30
Nov 25 15:30:14 crc kubenswrapper[4965]: I1125 15:30:14.443140 4965 generic.go:334] "Generic (PLEG): container finished" podID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerID="20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1" exitCode=143
Nov 25 15:30:14 crc kubenswrapper[4965]: I1125 15:30:14.443234 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6e7a54ba-ed58-416f-9336-ea8ae976cbfa","Type":"ContainerDied","Data":"20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1"}
Nov 25 15:30:14 crc kubenswrapper[4965]: I1125 15:30:14.459320 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-st74t"
Nov 25 15:30:16 crc kubenswrapper[4965]: I1125 15:30:16.098703 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.222448 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.240197 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-st74t" podStartSLOduration=7.24017925 podStartE2EDuration="7.24017925s" podCreationTimestamp="2025-11-25 15:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:13.459768101 +0000 UTC m=+1558.427361847" watchObservedRunningTime="2025-11-25 15:30:17.24017925 +0000 UTC m=+1562.207772996"
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.372290 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-logs\") pod \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") "
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.372349 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr7hp\" (UniqueName: \"kubernetes.io/projected/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-kube-api-access-nr7hp\") pod \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") "
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.372394 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-combined-ca-bundle\") pod \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") "
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.372455 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-config-data\") pod \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\" (UID: \"6e7a54ba-ed58-416f-9336-ea8ae976cbfa\") "
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.372805 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-logs" (OuterVolumeSpecName: "logs") pod "6e7a54ba-ed58-416f-9336-ea8ae976cbfa" (UID: "6e7a54ba-ed58-416f-9336-ea8ae976cbfa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.381016 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-kube-api-access-nr7hp" (OuterVolumeSpecName: "kube-api-access-nr7hp") pod "6e7a54ba-ed58-416f-9336-ea8ae976cbfa" (UID: "6e7a54ba-ed58-416f-9336-ea8ae976cbfa"). InnerVolumeSpecName "kube-api-access-nr7hp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.399872 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e7a54ba-ed58-416f-9336-ea8ae976cbfa" (UID: "6e7a54ba-ed58-416f-9336-ea8ae976cbfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.403672 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-config-data" (OuterVolumeSpecName: "config-data") pod "6e7a54ba-ed58-416f-9336-ea8ae976cbfa" (UID: "6e7a54ba-ed58-416f-9336-ea8ae976cbfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.450755 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.473568 4965 generic.go:334] "Generic (PLEG): container finished" podID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerID="4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f" exitCode=0
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.473629 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6e7a54ba-ed58-416f-9336-ea8ae976cbfa","Type":"ContainerDied","Data":"4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f"}
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.473662 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6e7a54ba-ed58-416f-9336-ea8ae976cbfa","Type":"ContainerDied","Data":"2310f75c8828fb4739dbbefc32d8068351293383b866cb4292746c4822af90cc"}
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.473678 4965 scope.go:117] "RemoveContainer" containerID="4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f"
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.473800 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.479325 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-logs\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.479351 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr7hp\" (UniqueName: \"kubernetes.io/projected/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-kube-api-access-nr7hp\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.479362 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.479370 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7a54ba-ed58-416f-9336-ea8ae976cbfa-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.483315 4965 generic.go:334] "Generic (PLEG): container finished" podID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerID="28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93" exitCode=0
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.483353 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerDied","Data":"28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93"}
Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.483379 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79bce5f5-733f-4651-8fa2-6756ff1993ee","Type":"ContainerDied","Data":"9372651e62955a133413d6cfaadb5e3fce62b8424b268c70d89f2fada9953efe"}
Nov 25 15:30:17 crc
kubenswrapper[4965]: I1125 15:30:17.483475 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.509386 4965 scope.go:117] "RemoveContainer" containerID="20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.530910 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.556939 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576514 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.576870 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="proxy-httpd" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576884 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="proxy-httpd" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.576898 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-log" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576904 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-log" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.576914 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-notification-agent" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576920 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-notification-agent" Nov 25 15:30:17 crc 
kubenswrapper[4965]: E1125 15:30:17.576936 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="sg-core" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576942 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="sg-core" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.576953 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-central-agent" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576958 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-central-agent" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.576990 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-api" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.576995 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-api" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.577133 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-notification-agent" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.577150 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="sg-core" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.577166 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-log" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.577176 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="ceilometer-central-agent" Nov 25 
15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.577185 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" containerName="nova-api-api" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.577191 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" containerName="proxy-httpd" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580010 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-combined-ca-bundle\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580054 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-scripts\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580081 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-run-httpd\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580120 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-config-data\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580161 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-ceilometer-tls-certs\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580195 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29c26\" (UniqueName: \"kubernetes.io/projected/79bce5f5-733f-4651-8fa2-6756ff1993ee-kube-api-access-29c26\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580238 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-sg-core-conf-yaml\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.580328 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-log-httpd\") pod \"79bce5f5-733f-4651-8fa2-6756ff1993ee\" (UID: \"79bce5f5-733f-4651-8fa2-6756ff1993ee\") " Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.581118 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.583335 4965 scope.go:117] "RemoveContainer" containerID="4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.586696 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f\": container with ID starting with 4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f not found: ID does not exist" containerID="4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.586737 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f"} err="failed to get container status \"4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f\": rpc error: code = NotFound desc = could not find container \"4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f\": container with ID starting with 4ab6f334f698f58b9bfe106c7193411bd898d49b17288df5c52fa846cb79477f not found: ID does not exist" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.586761 4965 scope.go:117] "RemoveContainer" containerID="20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.587233 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.588872 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.594524 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.594718 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.594824 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.595532 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1\": container with ID starting with 20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1 not found: ID does not exist" containerID="20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.595630 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1"} err="failed to get container status \"20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1\": rpc error: code = NotFound desc = could not find container \"20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1\": container with ID starting with 20dea860f07afddc362ddbfeff99e16982a52af1036f0faa845f7c4320e376e1 not found: ID does not exist" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.595728 4965 scope.go:117] "RemoveContainer" containerID="aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.617558 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/79bce5f5-733f-4651-8fa2-6756ff1993ee-kube-api-access-29c26" (OuterVolumeSpecName: "kube-api-access-29c26") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "kube-api-access-29c26". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.627228 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.648637 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-scripts" (OuterVolumeSpecName: "scripts") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.651801 4965 scope.go:117] "RemoveContainer" containerID="cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.672195 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.674900 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.682872 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-config-data\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.682925 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebab39c1-5830-4a94-928c-15f90cd85d67-logs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.682945 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-public-tls-certs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.682998 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683020 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683135 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzplw\" (UniqueName: \"kubernetes.io/projected/ebab39c1-5830-4a94-928c-15f90cd85d67-kube-api-access-qzplw\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683407 4965 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683426 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683435 4965 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79bce5f5-733f-4651-8fa2-6756ff1993ee-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683445 4965 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683456 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29c26\" (UniqueName: \"kubernetes.io/projected/79bce5f5-733f-4651-8fa2-6756ff1993ee-kube-api-access-29c26\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.683467 4965 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.699667 4965 scope.go:117] "RemoveContainer" 
containerID="28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.723449 4965 scope.go:117] "RemoveContainer" containerID="f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.731127 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.750701 4965 scope.go:117] "RemoveContainer" containerID="aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.751338 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d\": container with ID starting with aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d not found: ID does not exist" containerID="aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.751375 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d"} err="failed to get container status \"aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d\": rpc error: code = NotFound desc = could not find container \"aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d\": container with ID starting with aea5252b4cdc96d3841574583bdc2ea01fb4a21d1a251a274f2df2e6972c378d not found: ID does not exist" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 
15:30:17.751403 4965 scope.go:117] "RemoveContainer" containerID="cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.755456 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49\": container with ID starting with cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49 not found: ID does not exist" containerID="cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.755484 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49"} err="failed to get container status \"cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49\": rpc error: code = NotFound desc = could not find container \"cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49\": container with ID starting with cdebe319707df99f6ed6a834d2d182f9140a9d1fdda4dba69d2b914d70d0ca49 not found: ID does not exist" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.755499 4965 scope.go:117] "RemoveContainer" containerID="28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.756030 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93\": container with ID starting with 28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93 not found: ID does not exist" containerID="28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.756066 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93"} err="failed to get container status \"28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93\": rpc error: code = NotFound desc = could not find container \"28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93\": container with ID starting with 28aad851448a816bfef3359c213cfe5e724470b2253b38e33de8025ecd859c93 not found: ID does not exist" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.756079 4965 scope.go:117] "RemoveContainer" containerID="f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118" Nov 25 15:30:17 crc kubenswrapper[4965]: E1125 15:30:17.756415 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118\": container with ID starting with f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118 not found: ID does not exist" containerID="f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.756438 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118"} err="failed to get container status \"f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118\": rpc error: code = NotFound desc = could not find container \"f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118\": container with ID starting with f5680ba7dea33644d799587e43cb0e138cadc963cb3bad7a1fb983e7786ae118 not found: ID does not exist" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.786293 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.786362 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzplw\" (UniqueName: \"kubernetes.io/projected/ebab39c1-5830-4a94-928c-15f90cd85d67-kube-api-access-qzplw\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.786446 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-config-data\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.786478 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebab39c1-5830-4a94-928c-15f90cd85d67-logs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.786499 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-public-tls-certs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.786534 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.797124 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.807014 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebab39c1-5830-4a94-928c-15f90cd85d67-logs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.814358 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-public-tls-certs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.814696 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.816700 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-config-data\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.820621 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.823436 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzplw\" (UniqueName: 
\"kubernetes.io/projected/ebab39c1-5830-4a94-928c-15f90cd85d67-kube-api-access-qzplw\") pod \"nova-api-0\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " pod="openstack/nova-api-0" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.833826 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-config-data" (OuterVolumeSpecName: "config-data") pod "79bce5f5-733f-4651-8fa2-6756ff1993ee" (UID: "79bce5f5-733f-4651-8fa2-6756ff1993ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.899447 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79bce5f5-733f-4651-8fa2-6756ff1993ee-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:17 crc kubenswrapper[4965]: I1125 15:30:17.921858 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.125889 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.139437 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.171028 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.173105 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.176049 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.176173 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.177578 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.181316 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.305842 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmmb\" (UniqueName: \"kubernetes.io/projected/db88222c-47b1-4187-9794-50f067ffdc89-kube-api-access-7fmmb\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.305927 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.306124 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db88222c-47b1-4187-9794-50f067ffdc89-run-httpd\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.306209 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.306272 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-scripts\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.306345 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.306412 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db88222c-47b1-4187-9794-50f067ffdc89-log-httpd\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.306560 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-config-data\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408241 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db88222c-47b1-4187-9794-50f067ffdc89-run-httpd\") pod \"ceilometer-0\" (UID: 
\"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408305 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408336 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-scripts\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408366 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408402 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db88222c-47b1-4187-9794-50f067ffdc89-log-httpd\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408455 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-config-data\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408491 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7fmmb\" (UniqueName: \"kubernetes.io/projected/db88222c-47b1-4187-9794-50f067ffdc89-kube-api-access-7fmmb\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408521 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.408841 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db88222c-47b1-4187-9794-50f067ffdc89-run-httpd\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.409144 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db88222c-47b1-4187-9794-50f067ffdc89-log-httpd\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.415112 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-config-data\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.415791 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: 
I1125 15:30:18.416102 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-scripts\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.416446 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.420354 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db88222c-47b1-4187-9794-50f067ffdc89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.436712 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.438933 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmmb\" (UniqueName: \"kubernetes.io/projected/db88222c-47b1-4187-9794-50f067ffdc89-kube-api-access-7fmmb\") pod \"ceilometer-0\" (UID: \"db88222c-47b1-4187-9794-50f067ffdc89\") " pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.496056 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebab39c1-5830-4a94-928c-15f90cd85d67","Type":"ContainerStarted","Data":"e29e276543177c1899d3a1f9b547591e7579d6ff2e062bb5bcac24710c67cdaf"} Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.502032 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.788776 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7a54ba-ed58-416f-9336-ea8ae976cbfa" path="/var/lib/kubelet/pods/6e7a54ba-ed58-416f-9336-ea8ae976cbfa/volumes" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.790600 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bce5f5-733f-4651-8fa2-6756ff1993ee" path="/var/lib/kubelet/pods/79bce5f5-733f-4651-8fa2-6756ff1993ee/volumes" Nov 25 15:30:18 crc kubenswrapper[4965]: I1125 15:30:18.988354 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 15:30:19 crc kubenswrapper[4965]: I1125 15:30:19.506393 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebab39c1-5830-4a94-928c-15f90cd85d67","Type":"ContainerStarted","Data":"b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409"} Nov 25 15:30:19 crc kubenswrapper[4965]: I1125 15:30:19.506445 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebab39c1-5830-4a94-928c-15f90cd85d67","Type":"ContainerStarted","Data":"0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae"} Nov 25 15:30:19 crc kubenswrapper[4965]: I1125 15:30:19.509923 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db88222c-47b1-4187-9794-50f067ffdc89","Type":"ContainerStarted","Data":"e37543f502fd3aa7c46523b69ec5afa9ac119c21d04b2e604e8f9961c0a0aa63"} Nov 25 15:30:19 crc kubenswrapper[4965]: I1125 15:30:19.550853 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.550831137 podStartE2EDuration="2.550831137s" podCreationTimestamp="2025-11-25 15:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 15:30:19.53369841 +0000 UTC m=+1564.501292196" watchObservedRunningTime="2025-11-25 15:30:19.550831137 +0000 UTC m=+1564.518424883" Nov 25 15:30:20 crc kubenswrapper[4965]: I1125 15:30:20.523056 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db88222c-47b1-4187-9794-50f067ffdc89","Type":"ContainerStarted","Data":"a3fa404b8a9202968deac0ec17142db4e7c4c639450fb05313b4c5e80adfb740"} Nov 25 15:30:20 crc kubenswrapper[4965]: I1125 15:30:20.922572 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-st74t" Nov 25 15:30:20 crc kubenswrapper[4965]: I1125 15:30:20.994658 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-s6z25"] Nov 25 15:30:20 crc kubenswrapper[4965]: I1125 15:30:20.994949 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerName="dnsmasq-dns" containerID="cri-o://25bd374fe6559789b413b1c6550728ef13fad5dfda716db96d0dd7739fff1a8d" gracePeriod=10 Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.098550 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.151447 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.553986 4965 generic.go:334] "Generic (PLEG): container finished" podID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerID="25bd374fe6559789b413b1c6550728ef13fad5dfda716db96d0dd7739fff1a8d" exitCode=0 Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.554349 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" 
event={"ID":"db43ec48-b8c9-4a63-960d-aaf762b4e184","Type":"ContainerDied","Data":"25bd374fe6559789b413b1c6550728ef13fad5dfda716db96d0dd7739fff1a8d"} Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.572080 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db88222c-47b1-4187-9794-50f067ffdc89","Type":"ContainerStarted","Data":"aa250fde52470887498e6f407053c0b8848a9241c5f2289c7df7ca687928dcc2"} Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.572132 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db88222c-47b1-4187-9794-50f067ffdc89","Type":"ContainerStarted","Data":"5af5d09136fce08383526e513a11cee7aec7d930a8a4f3714bb334a759865e7a"} Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.618038 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.741543 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.857218 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xs2g6"] Nov 25 15:30:21 crc kubenswrapper[4965]: E1125 15:30:21.857808 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerName="init" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.857873 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerName="init" Nov 25 15:30:21 crc kubenswrapper[4965]: E1125 15:30:21.857926 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerName="dnsmasq-dns" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.857991 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerName="dnsmasq-dns" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.858223 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" containerName="dnsmasq-dns" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.858827 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.861735 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.862056 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.877352 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xs2g6"] Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.894912 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-dns-svc\") pod \"db43ec48-b8c9-4a63-960d-aaf762b4e184\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.895018 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-sb\") pod \"db43ec48-b8c9-4a63-960d-aaf762b4e184\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.895139 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qtj\" (UniqueName: \"kubernetes.io/projected/db43ec48-b8c9-4a63-960d-aaf762b4e184-kube-api-access-s9qtj\") pod \"db43ec48-b8c9-4a63-960d-aaf762b4e184\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.895210 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-config\") pod \"db43ec48-b8c9-4a63-960d-aaf762b4e184\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " Nov 25 15:30:21 crc 
kubenswrapper[4965]: I1125 15:30:21.895244 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-nb\") pod \"db43ec48-b8c9-4a63-960d-aaf762b4e184\" (UID: \"db43ec48-b8c9-4a63-960d-aaf762b4e184\") " Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.932537 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db43ec48-b8c9-4a63-960d-aaf762b4e184-kube-api-access-s9qtj" (OuterVolumeSpecName: "kube-api-access-s9qtj") pod "db43ec48-b8c9-4a63-960d-aaf762b4e184" (UID: "db43ec48-b8c9-4a63-960d-aaf762b4e184"). InnerVolumeSpecName "kube-api-access-s9qtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.960740 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-config" (OuterVolumeSpecName: "config") pod "db43ec48-b8c9-4a63-960d-aaf762b4e184" (UID: "db43ec48-b8c9-4a63-960d-aaf762b4e184"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.997183 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.997309 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-scripts\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.997384 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5jm\" (UniqueName: \"kubernetes.io/projected/2c0d36b9-701f-42e2-9a0d-03d0db171c59-kube-api-access-hl5jm\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.997422 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-config-data\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:21 crc kubenswrapper[4965]: I1125 15:30:21.997504 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qtj\" (UniqueName: \"kubernetes.io/projected/db43ec48-b8c9-4a63-960d-aaf762b4e184-kube-api-access-s9qtj\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:21 crc 
kubenswrapper[4965]: I1125 15:30:21.997519 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.006325 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db43ec48-b8c9-4a63-960d-aaf762b4e184" (UID: "db43ec48-b8c9-4a63-960d-aaf762b4e184"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.020864 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db43ec48-b8c9-4a63-960d-aaf762b4e184" (UID: "db43ec48-b8c9-4a63-960d-aaf762b4e184"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.022076 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db43ec48-b8c9-4a63-960d-aaf762b4e184" (UID: "db43ec48-b8c9-4a63-960d-aaf762b4e184"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099108 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl5jm\" (UniqueName: \"kubernetes.io/projected/2c0d36b9-701f-42e2-9a0d-03d0db171c59-kube-api-access-hl5jm\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099182 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-config-data\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099255 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099374 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-scripts\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099450 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099467 4965 reconciler_common.go:293] "Volume detached for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.099478 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db43ec48-b8c9-4a63-960d-aaf762b4e184-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.102940 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-scripts\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.105072 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.104249 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-config-data\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.123101 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5jm\" (UniqueName: \"kubernetes.io/projected/2c0d36b9-701f-42e2-9a0d-03d0db171c59-kube-api-access-hl5jm\") pod \"nova-cell1-cell-mapping-xs2g6\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") " pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.173468 4965 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xs2g6" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.589470 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.590206 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-s6z25" event={"ID":"db43ec48-b8c9-4a63-960d-aaf762b4e184","Type":"ContainerDied","Data":"11dee7a3392af7881f9f6ccc686fb7c547484d4274a2c00005bc241e0d87afba"} Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.590251 4965 scope.go:117] "RemoveContainer" containerID="25bd374fe6559789b413b1c6550728ef13fad5dfda716db96d0dd7739fff1a8d" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.624380 4965 scope.go:117] "RemoveContainer" containerID="957d5304ffccad3a9264df3e5663d6c098d57ecddb1ad72013ce2fa91b99aac5" Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.660841 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-s6z25"] Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.679079 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-s6z25"] Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.683055 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xs2g6"] Nov 25 15:30:22 crc kubenswrapper[4965]: W1125 15:30:22.730183 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c0d36b9_701f_42e2_9a0d_03d0db171c59.slice/crio-58558cf2d16388d97b9d265906a57355fd5ce537f9078b84950721bbe9435499 WatchSource:0}: Error finding container 58558cf2d16388d97b9d265906a57355fd5ce537f9078b84950721bbe9435499: Status 404 returned error can't find the container with id 
58558cf2d16388d97b9d265906a57355fd5ce537f9078b84950721bbe9435499
Nov 25 15:30:22 crc kubenswrapper[4965]: I1125 15:30:22.784527 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db43ec48-b8c9-4a63-960d-aaf762b4e184" path="/var/lib/kubelet/pods/db43ec48-b8c9-4a63-960d-aaf762b4e184/volumes"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.260562 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.260626 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.260669 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.261370 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.261424 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" gracePeriod=600
Nov 25 15:30:23 crc kubenswrapper[4965]: E1125 15:30:23.382212 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.600669 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db88222c-47b1-4187-9794-50f067ffdc89","Type":"ContainerStarted","Data":"929f7c47f879cd52cc51ab1d7896c24ef0ca662a1b2cce491c43a6f68758b3e7"}
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.600797 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.604048 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xs2g6" event={"ID":"2c0d36b9-701f-42e2-9a0d-03d0db171c59","Type":"ContainerStarted","Data":"cb9a4beaf181edc3422d7d7eafc3531f7cecdb96b1dfb138abfaef7b5872ff7b"}
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.604098 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xs2g6" event={"ID":"2c0d36b9-701f-42e2-9a0d-03d0db171c59","Type":"ContainerStarted","Data":"58558cf2d16388d97b9d265906a57355fd5ce537f9078b84950721bbe9435499"}
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.607646 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" exitCode=0
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.607681 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab"}
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.607709 4965 scope.go:117] "RemoveContainer" containerID="a0c4deae36fbd6888b83491cb53bd4ad9a4b3cad48a12bfa6331042ee58854cf"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.608063 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab"
Nov 25 15:30:23 crc kubenswrapper[4965]: E1125 15:30:23.608261 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.643134 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.618308268 podStartE2EDuration="5.643104835s" podCreationTimestamp="2025-11-25 15:30:18 +0000 UTC" firstStartedPulling="2025-11-25 15:30:19.005076036 +0000 UTC m=+1563.972669782" lastFinishedPulling="2025-11-25 15:30:23.029872603 +0000 UTC m=+1567.997466349" observedRunningTime="2025-11-25 15:30:23.625290878 +0000 UTC m=+1568.592884634" watchObservedRunningTime="2025-11-25 15:30:23.643104835 +0000 UTC m=+1568.610698601"
Nov 25 15:30:23 crc kubenswrapper[4965]: I1125 15:30:23.702409 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xs2g6" podStartSLOduration=2.702393192 podStartE2EDuration="2.702393192s" podCreationTimestamp="2025-11-25 15:30:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:23.696611045 +0000 UTC m=+1568.664204791" watchObservedRunningTime="2025-11-25 15:30:23.702393192 +0000 UTC m=+1568.669986938"
Nov 25 15:30:27 crc kubenswrapper[4965]: I1125 15:30:27.650689 4965 generic.go:334] "Generic (PLEG): container finished" podID="2c0d36b9-701f-42e2-9a0d-03d0db171c59" containerID="cb9a4beaf181edc3422d7d7eafc3531f7cecdb96b1dfb138abfaef7b5872ff7b" exitCode=0
Nov 25 15:30:27 crc kubenswrapper[4965]: I1125 15:30:27.650774 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xs2g6" event={"ID":"2c0d36b9-701f-42e2-9a0d-03d0db171c59","Type":"ContainerDied","Data":"cb9a4beaf181edc3422d7d7eafc3531f7cecdb96b1dfb138abfaef7b5872ff7b"}
Nov 25 15:30:27 crc kubenswrapper[4965]: I1125 15:30:27.922373 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 25 15:30:27 crc kubenswrapper[4965]: I1125 15:30:27.922422 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 25 15:30:28 crc kubenswrapper[4965]: I1125 15:30:28.934405 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.184:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 25 15:30:28 crc kubenswrapper[4965]: I1125 15:30:28.934670 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.184:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.044957 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xs2g6"
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.148435 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-scripts\") pod \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") "
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.148674 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-combined-ca-bundle\") pod \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") "
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.148870 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl5jm\" (UniqueName: \"kubernetes.io/projected/2c0d36b9-701f-42e2-9a0d-03d0db171c59-kube-api-access-hl5jm\") pod \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") "
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.148996 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-config-data\") pod \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\" (UID: \"2c0d36b9-701f-42e2-9a0d-03d0db171c59\") "
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.244432 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-scripts" (OuterVolumeSpecName: "scripts") pod "2c0d36b9-701f-42e2-9a0d-03d0db171c59" (UID: "2c0d36b9-701f-42e2-9a0d-03d0db171c59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.248316 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0d36b9-701f-42e2-9a0d-03d0db171c59-kube-api-access-hl5jm" (OuterVolumeSpecName: "kube-api-access-hl5jm") pod "2c0d36b9-701f-42e2-9a0d-03d0db171c59" (UID: "2c0d36b9-701f-42e2-9a0d-03d0db171c59"). InnerVolumeSpecName "kube-api-access-hl5jm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.265766 4965 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.265793 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl5jm\" (UniqueName: \"kubernetes.io/projected/2c0d36b9-701f-42e2-9a0d-03d0db171c59-kube-api-access-hl5jm\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.268133 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-config-data" (OuterVolumeSpecName: "config-data") pod "2c0d36b9-701f-42e2-9a0d-03d0db171c59" (UID: "2c0d36b9-701f-42e2-9a0d-03d0db171c59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.272977 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c0d36b9-701f-42e2-9a0d-03d0db171c59" (UID: "2c0d36b9-701f-42e2-9a0d-03d0db171c59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.410220 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.410252 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0d36b9-701f-42e2-9a0d-03d0db171c59-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.696824 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xs2g6" event={"ID":"2c0d36b9-701f-42e2-9a0d-03d0db171c59","Type":"ContainerDied","Data":"58558cf2d16388d97b9d265906a57355fd5ce537f9078b84950721bbe9435499"}
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.696871 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58558cf2d16388d97b9d265906a57355fd5ce537f9078b84950721bbe9435499"
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.696934 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xs2g6"
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.859981 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.860252 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-log" containerID="cri-o://0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae" gracePeriod=30
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.860375 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-api" containerID="cri-o://b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409" gracePeriod=30
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.883448 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.883885 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5598c7d5-0a06-43c1-8ed9-94e57832fdea" containerName="nova-scheduler-scheduler" containerID="cri-o://30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5" gracePeriod=30
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.918114 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.918684 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-log" containerID="cri-o://78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290" gracePeriod=30
Nov 25 15:30:29 crc kubenswrapper[4965]: I1125 15:30:29.918869 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-metadata" containerID="cri-o://7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375" gracePeriod=30
Nov 25 15:30:30 crc kubenswrapper[4965]: I1125 15:30:30.708581 4965 generic.go:334] "Generic (PLEG): container finished" podID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerID="78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290" exitCode=143
Nov 25 15:30:30 crc kubenswrapper[4965]: I1125 15:30:30.708640 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db135a55-a79d-4cd7-8757-75d9fd2f17d7","Type":"ContainerDied","Data":"78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290"}
Nov 25 15:30:30 crc kubenswrapper[4965]: I1125 15:30:30.711010 4965 generic.go:334] "Generic (PLEG): container finished" podID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerID="0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae" exitCode=143
Nov 25 15:30:30 crc kubenswrapper[4965]: I1125 15:30:30.711066 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebab39c1-5830-4a94-928c-15f90cd85d67","Type":"ContainerDied","Data":"0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae"}
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.472126 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.645752 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg8zj\" (UniqueName: \"kubernetes.io/projected/5598c7d5-0a06-43c1-8ed9-94e57832fdea-kube-api-access-tg8zj\") pod \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") "
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.645846 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-config-data\") pod \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") "
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.645875 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-combined-ca-bundle\") pod \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\" (UID: \"5598c7d5-0a06-43c1-8ed9-94e57832fdea\") "
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.656152 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5598c7d5-0a06-43c1-8ed9-94e57832fdea-kube-api-access-tg8zj" (OuterVolumeSpecName: "kube-api-access-tg8zj") pod "5598c7d5-0a06-43c1-8ed9-94e57832fdea" (UID: "5598c7d5-0a06-43c1-8ed9-94e57832fdea"). InnerVolumeSpecName "kube-api-access-tg8zj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.671661 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5598c7d5-0a06-43c1-8ed9-94e57832fdea" (UID: "5598c7d5-0a06-43c1-8ed9-94e57832fdea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.679613 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-config-data" (OuterVolumeSpecName: "config-data") pod "5598c7d5-0a06-43c1-8ed9-94e57832fdea" (UID: "5598c7d5-0a06-43c1-8ed9-94e57832fdea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.721772 4965 generic.go:334] "Generic (PLEG): container finished" podID="5598c7d5-0a06-43c1-8ed9-94e57832fdea" containerID="30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5" exitCode=0
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.721813 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5598c7d5-0a06-43c1-8ed9-94e57832fdea","Type":"ContainerDied","Data":"30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5"}
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.721838 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5598c7d5-0a06-43c1-8ed9-94e57832fdea","Type":"ContainerDied","Data":"98abdb9d05599d5f9cac74bc324da73ef76cc1a225978072de898dcc26c50be6"}
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.721853 4965 scope.go:117] "RemoveContainer" containerID="30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.721960 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.748735 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg8zj\" (UniqueName: \"kubernetes.io/projected/5598c7d5-0a06-43c1-8ed9-94e57832fdea-kube-api-access-tg8zj\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.748757 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.748766 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598c7d5-0a06-43c1-8ed9-94e57832fdea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.765327 4965 scope.go:117] "RemoveContainer" containerID="30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5"
Nov 25 15:30:31 crc kubenswrapper[4965]: E1125 15:30:31.774068 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5\": container with ID starting with 30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5 not found: ID does not exist" containerID="30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.774108 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5"} err="failed to get container status \"30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5\": rpc error: code = NotFound desc = could not find container \"30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5\": container with ID starting with 30509d623c8f9cc2a7eb438f87ff6b5d0351418dc981df38c84c03a209314fb5 not found: ID does not exist"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.788120 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.799858 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.810761 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 15:30:31 crc kubenswrapper[4965]: E1125 15:30:31.811206 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5598c7d5-0a06-43c1-8ed9-94e57832fdea" containerName="nova-scheduler-scheduler"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.811225 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="5598c7d5-0a06-43c1-8ed9-94e57832fdea" containerName="nova-scheduler-scheduler"
Nov 25 15:30:31 crc kubenswrapper[4965]: E1125 15:30:31.811257 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0d36b9-701f-42e2-9a0d-03d0db171c59" containerName="nova-manage"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.811265 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0d36b9-701f-42e2-9a0d-03d0db171c59" containerName="nova-manage"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.811441 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0d36b9-701f-42e2-9a0d-03d0db171c59" containerName="nova-manage"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.811469 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="5598c7d5-0a06-43c1-8ed9-94e57832fdea" containerName="nova-scheduler-scheduler"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.812032 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.814164 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.838180 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.968166 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p846k\" (UniqueName: \"kubernetes.io/projected/bfe5968d-c0b6-4e22-802d-4d36f89347db-kube-api-access-p846k\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.968418 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5968d-c0b6-4e22-802d-4d36f89347db-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:31 crc kubenswrapper[4965]: I1125 15:30:31.968519 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe5968d-c0b6-4e22-802d-4d36f89347db-config-data\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.070287 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p846k\" (UniqueName: \"kubernetes.io/projected/bfe5968d-c0b6-4e22-802d-4d36f89347db-kube-api-access-p846k\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.070385 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5968d-c0b6-4e22-802d-4d36f89347db-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.070447 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe5968d-c0b6-4e22-802d-4d36f89347db-config-data\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.075823 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe5968d-c0b6-4e22-802d-4d36f89347db-config-data\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.075839 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5968d-c0b6-4e22-802d-4d36f89347db-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.091399 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p846k\" (UniqueName: \"kubernetes.io/projected/bfe5968d-c0b6-4e22-802d-4d36f89347db-kube-api-access-p846k\") pod \"nova-scheduler-0\" (UID: \"bfe5968d-c0b6-4e22-802d-4d36f89347db\") " pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.141042 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.607962 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.732801 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bfe5968d-c0b6-4e22-802d-4d36f89347db","Type":"ContainerStarted","Data":"cd89c13fb5947900e3e52af5ec15dc49b2e1e454123a3f51001e2f3a94e26f4a"}
Nov 25 15:30:32 crc kubenswrapper[4965]: I1125 15:30:32.786636 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5598c7d5-0a06-43c1-8ed9-94e57832fdea" path="/var/lib/kubelet/pods/5598c7d5-0a06-43c1-8ed9-94e57832fdea/volumes"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.553251 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.696673 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-config-data\") pod \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") "
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.696820 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc7jr\" (UniqueName: \"kubernetes.io/projected/db135a55-a79d-4cd7-8757-75d9fd2f17d7-kube-api-access-lc7jr\") pod \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") "
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.696875 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-nova-metadata-tls-certs\") pod \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") "
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.696893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db135a55-a79d-4cd7-8757-75d9fd2f17d7-logs\") pod \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") "
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.697007 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-combined-ca-bundle\") pod \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\" (UID: \"db135a55-a79d-4cd7-8757-75d9fd2f17d7\") "
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.698238 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db135a55-a79d-4cd7-8757-75d9fd2f17d7-logs" (OuterVolumeSpecName: "logs") pod "db135a55-a79d-4cd7-8757-75d9fd2f17d7" (UID: "db135a55-a79d-4cd7-8757-75d9fd2f17d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.718368 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db135a55-a79d-4cd7-8757-75d9fd2f17d7-kube-api-access-lc7jr" (OuterVolumeSpecName: "kube-api-access-lc7jr") pod "db135a55-a79d-4cd7-8757-75d9fd2f17d7" (UID: "db135a55-a79d-4cd7-8757-75d9fd2f17d7"). InnerVolumeSpecName "kube-api-access-lc7jr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.725958 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db135a55-a79d-4cd7-8757-75d9fd2f17d7" (UID: "db135a55-a79d-4cd7-8757-75d9fd2f17d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.731007 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-config-data" (OuterVolumeSpecName: "config-data") pod "db135a55-a79d-4cd7-8757-75d9fd2f17d7" (UID: "db135a55-a79d-4cd7-8757-75d9fd2f17d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.746458 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bfe5968d-c0b6-4e22-802d-4d36f89347db","Type":"ContainerStarted","Data":"a9eebb715d9f3c934cb32c243a0a44f2c94b398be7a94eeca3059b106c4921dc"}
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.764339 4965 generic.go:334] "Generic (PLEG): container finished" podID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerID="7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375" exitCode=0
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.764404 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db135a55-a79d-4cd7-8757-75d9fd2f17d7","Type":"ContainerDied","Data":"7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375"}
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.764445 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db135a55-a79d-4cd7-8757-75d9fd2f17d7","Type":"ContainerDied","Data":"1c267eb0c27e48898bc7baa910f646207f951be4b6e0af48c06383b119c3ac85"}
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.764467 4965 scope.go:117] "RemoveContainer" containerID="7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.764630 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.772162 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "db135a55-a79d-4cd7-8757-75d9fd2f17d7" (UID: "db135a55-a79d-4cd7-8757-75d9fd2f17d7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.772438 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab"
Nov 25 15:30:33 crc kubenswrapper[4965]: E1125 15:30:33.773098 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.793528 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7935059190000002 podStartE2EDuration="2.793505919s" podCreationTimestamp="2025-11-25 15:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:33.786109177 +0000 UTC m=+1578.753702943" watchObservedRunningTime="2025-11-25 15:30:33.793505919 +0000 UTC m=+1578.761099665"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.798831 4965 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.798891 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db135a55-a79d-4cd7-8757-75d9fd2f17d7-logs\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.798903 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.798911 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db135a55-a79d-4cd7-8757-75d9fd2f17d7-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.798919 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc7jr\" (UniqueName: \"kubernetes.io/projected/db135a55-a79d-4cd7-8757-75d9fd2f17d7-kube-api-access-lc7jr\") on node \"crc\" DevicePath \"\""
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.858719 4965 scope.go:117] "RemoveContainer" containerID="78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.880285 4965 scope.go:117] "RemoveContainer" containerID="7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375"
Nov 25 15:30:33 crc kubenswrapper[4965]: E1125 15:30:33.880654 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375\": container with ID starting with 7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375 not found: ID does not exist" containerID="7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.880697 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375"} err="failed to get container status \"7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375\": rpc error: code = NotFound desc = could not find container \"7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375\": container with ID starting with 7ee708ed58f2727dad7831d695619264010c055c620e4ace4e934ea7d9682375 not found: ID does not exist"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.880724 4965 scope.go:117] "RemoveContainer" containerID="78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290"
Nov 25 15:30:33 crc kubenswrapper[4965]: E1125 15:30:33.881409 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290\": container with ID starting with 78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290 not found: ID does not exist" containerID="78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290"
Nov 25 15:30:33 crc kubenswrapper[4965]: I1125 15:30:33.881443 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290"} err="failed to get container status \"78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290\": rpc error: code = NotFound desc = could not find container \"78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290\": container with ID starting with 78054c773143bcdc744896313364b462b3c8137aa63d8db331a80547312bc290 not found: ID does not exist"
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.121006 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.146904 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.165103 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 15:30:34 crc kubenswrapper[4965]: E1125 15:30:34.165663 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-log"
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.165688 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-log"
Nov 25 15:30:34 crc kubenswrapper[4965]: E1125 15:30:34.165724 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-metadata"
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.165736 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-metadata"
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.166011 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-metadata"
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.166044 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" containerName="nova-metadata-log"
Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.167487 4965 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.172141 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.172295 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.173088 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.307799 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009897a7-5f6c-44a3-8076-262d3f946ae9-logs\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.308056 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.308084 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-config-data\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.308111 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.308167 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g95mk\" (UniqueName: \"kubernetes.io/projected/009897a7-5f6c-44a3-8076-262d3f946ae9-kube-api-access-g95mk\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.409734 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009897a7-5f6c-44a3-8076-262d3f946ae9-logs\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.409788 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.409823 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-config-data\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.409852 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.409923 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g95mk\" (UniqueName: \"kubernetes.io/projected/009897a7-5f6c-44a3-8076-262d3f946ae9-kube-api-access-g95mk\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.410300 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009897a7-5f6c-44a3-8076-262d3f946ae9-logs\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.413636 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-config-data\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.413861 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.415257 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/009897a7-5f6c-44a3-8076-262d3f946ae9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.428561 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g95mk\" (UniqueName: \"kubernetes.io/projected/009897a7-5f6c-44a3-8076-262d3f946ae9-kube-api-access-g95mk\") pod 
\"nova-metadata-0\" (UID: \"009897a7-5f6c-44a3-8076-262d3f946ae9\") " pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.518886 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 15:30:34 crc kubenswrapper[4965]: I1125 15:30:34.785956 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db135a55-a79d-4cd7-8757-75d9fd2f17d7" path="/var/lib/kubelet/pods/db135a55-a79d-4cd7-8757-75d9fd2f17d7/volumes" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.026456 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.710475 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.790943 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"009897a7-5f6c-44a3-8076-262d3f946ae9","Type":"ContainerStarted","Data":"b1fa64fb9f638c81abde90b2735a24791f090219dae7b3a488219a2fd1b078af"} Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.792055 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"009897a7-5f6c-44a3-8076-262d3f946ae9","Type":"ContainerStarted","Data":"a9187af0c764f8eb0b88a5e6db3fb31d1e804b5ba390b3f61bd7c316e5640f16"} Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.792149 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"009897a7-5f6c-44a3-8076-262d3f946ae9","Type":"ContainerStarted","Data":"566323b45b46c15d6d06642e26483abf186df40e228136f972345b29e8c1e36a"} Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.795550 4965 generic.go:334] "Generic (PLEG): container finished" podID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerID="b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409" 
exitCode=0 Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.795754 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.795863 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebab39c1-5830-4a94-928c-15f90cd85d67","Type":"ContainerDied","Data":"b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409"} Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.795981 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebab39c1-5830-4a94-928c-15f90cd85d67","Type":"ContainerDied","Data":"e29e276543177c1899d3a1f9b547591e7579d6ff2e062bb5bcac24710c67cdaf"} Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.796082 4965 scope.go:117] "RemoveContainer" containerID="b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.815424 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.815404697 podStartE2EDuration="1.815404697s" podCreationTimestamp="2025-11-25 15:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:35.811713255 +0000 UTC m=+1580.779307001" watchObservedRunningTime="2025-11-25 15:30:35.815404697 +0000 UTC m=+1580.782998433" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.821136 4965 scope.go:117] "RemoveContainer" containerID="0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.836515 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-internal-tls-certs\") pod \"ebab39c1-5830-4a94-928c-15f90cd85d67\" 
(UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.836641 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-combined-ca-bundle\") pod \"ebab39c1-5830-4a94-928c-15f90cd85d67\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.836698 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzplw\" (UniqueName: \"kubernetes.io/projected/ebab39c1-5830-4a94-928c-15f90cd85d67-kube-api-access-qzplw\") pod \"ebab39c1-5830-4a94-928c-15f90cd85d67\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.837093 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-public-tls-certs\") pod \"ebab39c1-5830-4a94-928c-15f90cd85d67\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.837145 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebab39c1-5830-4a94-928c-15f90cd85d67-logs\") pod \"ebab39c1-5830-4a94-928c-15f90cd85d67\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.837177 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-config-data\") pod \"ebab39c1-5830-4a94-928c-15f90cd85d67\" (UID: \"ebab39c1-5830-4a94-928c-15f90cd85d67\") " Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.837464 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ebab39c1-5830-4a94-928c-15f90cd85d67-logs" (OuterVolumeSpecName: "logs") pod "ebab39c1-5830-4a94-928c-15f90cd85d67" (UID: "ebab39c1-5830-4a94-928c-15f90cd85d67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.837571 4965 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebab39c1-5830-4a94-928c-15f90cd85d67-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.839595 4965 scope.go:117] "RemoveContainer" containerID="b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409" Nov 25 15:30:35 crc kubenswrapper[4965]: E1125 15:30:35.841596 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409\": container with ID starting with b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409 not found: ID does not exist" containerID="b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.841652 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409"} err="failed to get container status \"b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409\": rpc error: code = NotFound desc = could not find container \"b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409\": container with ID starting with b5013f8c008fb3470c88a8c9f659dc43237dc527eafde865a0c27b02871e0409 not found: ID does not exist" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.841685 4965 scope.go:117] "RemoveContainer" containerID="0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae" Nov 25 15:30:35 crc kubenswrapper[4965]: E1125 15:30:35.842081 
4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae\": container with ID starting with 0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae not found: ID does not exist" containerID="0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.842124 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae"} err="failed to get container status \"0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae\": rpc error: code = NotFound desc = could not find container \"0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae\": container with ID starting with 0d4c2ab83de2aa2d8b5ae8903e39abb7eabaef133a32e3675fe2907951298aae not found: ID does not exist" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.842160 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebab39c1-5830-4a94-928c-15f90cd85d67-kube-api-access-qzplw" (OuterVolumeSpecName: "kube-api-access-qzplw") pod "ebab39c1-5830-4a94-928c-15f90cd85d67" (UID: "ebab39c1-5830-4a94-928c-15f90cd85d67"). InnerVolumeSpecName "kube-api-access-qzplw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.866691 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-config-data" (OuterVolumeSpecName: "config-data") pod "ebab39c1-5830-4a94-928c-15f90cd85d67" (UID: "ebab39c1-5830-4a94-928c-15f90cd85d67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.867024 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebab39c1-5830-4a94-928c-15f90cd85d67" (UID: "ebab39c1-5830-4a94-928c-15f90cd85d67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.883452 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebab39c1-5830-4a94-928c-15f90cd85d67" (UID: "ebab39c1-5830-4a94-928c-15f90cd85d67"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.891957 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebab39c1-5830-4a94-928c-15f90cd85d67" (UID: "ebab39c1-5830-4a94-928c-15f90cd85d67"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.939404 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.939458 4965 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.939474 4965 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.939490 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzplw\" (UniqueName: \"kubernetes.io/projected/ebab39c1-5830-4a94-928c-15f90cd85d67-kube-api-access-qzplw\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:35 crc kubenswrapper[4965]: I1125 15:30:35.939502 4965 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab39c1-5830-4a94-928c-15f90cd85d67-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.134610 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.143950 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.160016 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:36 crc kubenswrapper[4965]: E1125 15:30:36.160330 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-log" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.160342 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-log" Nov 25 15:30:36 crc kubenswrapper[4965]: E1125 15:30:36.160372 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-api" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.160378 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-api" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.171223 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-log" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.171273 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" containerName="nova-api-api" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.172360 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.174424 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.178566 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.178744 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.183579 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.344276 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-public-tls-certs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.344330 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkwg\" (UniqueName: \"kubernetes.io/projected/3e161c28-2cef-473a-bc7f-88e13dbb55c3-kube-api-access-xzkwg\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.344769 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-config-data\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.345047 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.345130 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e161c28-2cef-473a-bc7f-88e13dbb55c3-logs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.345216 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.447028 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-config-data\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.447191 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.447229 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e161c28-2cef-473a-bc7f-88e13dbb55c3-logs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc 
kubenswrapper[4965]: I1125 15:30:36.447276 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.447400 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-public-tls-certs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.447441 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzkwg\" (UniqueName: \"kubernetes.io/projected/3e161c28-2cef-473a-bc7f-88e13dbb55c3-kube-api-access-xzkwg\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.448740 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e161c28-2cef-473a-bc7f-88e13dbb55c3-logs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.454448 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.456747 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.457906 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-config-data\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.463594 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e161c28-2cef-473a-bc7f-88e13dbb55c3-public-tls-certs\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.483167 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzkwg\" (UniqueName: \"kubernetes.io/projected/3e161c28-2cef-473a-bc7f-88e13dbb55c3-kube-api-access-xzkwg\") pod \"nova-api-0\" (UID: \"3e161c28-2cef-473a-bc7f-88e13dbb55c3\") " pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.538922 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 15:30:36 crc kubenswrapper[4965]: I1125 15:30:36.784456 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebab39c1-5830-4a94-928c-15f90cd85d67" path="/var/lib/kubelet/pods/ebab39c1-5830-4a94-928c-15f90cd85d67/volumes" Nov 25 15:30:37 crc kubenswrapper[4965]: I1125 15:30:37.012938 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 15:30:37 crc kubenswrapper[4965]: I1125 15:30:37.142014 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 15:30:37 crc kubenswrapper[4965]: I1125 15:30:37.816321 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e161c28-2cef-473a-bc7f-88e13dbb55c3","Type":"ContainerStarted","Data":"9e984fcd08938cc23317586ed969e1f58e1971459eb1c7ddcfcdb08bfaca48f1"} Nov 25 15:30:37 crc kubenswrapper[4965]: I1125 15:30:37.816791 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e161c28-2cef-473a-bc7f-88e13dbb55c3","Type":"ContainerStarted","Data":"4f7abb4d8e85df00b9e5346e261fb8c43ab88086a58d1c123083643c6611668e"} Nov 25 15:30:37 crc kubenswrapper[4965]: I1125 15:30:37.816834 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e161c28-2cef-473a-bc7f-88e13dbb55c3","Type":"ContainerStarted","Data":"5dfcc81027ebabe4c0fecab90eae5f36a13a0b7ba0d7eb217e4a9ac438a6627d"} Nov 25 15:30:37 crc kubenswrapper[4965]: I1125 15:30:37.848501 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.848481829 podStartE2EDuration="1.848481829s" podCreationTimestamp="2025-11-25 15:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:30:37.836746909 +0000 UTC m=+1582.804340675" 
watchObservedRunningTime="2025-11-25 15:30:37.848481829 +0000 UTC m=+1582.816075575" Nov 25 15:30:39 crc kubenswrapper[4965]: I1125 15:30:39.519450 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 15:30:39 crc kubenswrapper[4965]: I1125 15:30:39.519806 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 15:30:42 crc kubenswrapper[4965]: I1125 15:30:42.142377 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 15:30:42 crc kubenswrapper[4965]: I1125 15:30:42.172442 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 15:30:42 crc kubenswrapper[4965]: I1125 15:30:42.885443 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 15:30:44 crc kubenswrapper[4965]: I1125 15:30:44.519109 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 15:30:44 crc kubenswrapper[4965]: I1125 15:30:44.519204 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 15:30:44 crc kubenswrapper[4965]: I1125 15:30:44.775825 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:30:44 crc kubenswrapper[4965]: E1125 15:30:44.776682 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:30:45 crc kubenswrapper[4965]: I1125 
15:30:45.534225 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="009897a7-5f6c-44a3-8076-262d3f946ae9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 15:30:45 crc kubenswrapper[4965]: I1125 15:30:45.534236 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="009897a7-5f6c-44a3-8076-262d3f946ae9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 15:30:46 crc kubenswrapper[4965]: I1125 15:30:46.540187 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 15:30:46 crc kubenswrapper[4965]: I1125 15:30:46.540484 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 15:30:47 crc kubenswrapper[4965]: I1125 15:30:47.556545 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e161c28-2cef-473a-bc7f-88e13dbb55c3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.189:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 15:30:47 crc kubenswrapper[4965]: I1125 15:30:47.556564 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e161c28-2cef-473a-bc7f-88e13dbb55c3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.189:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 15:30:48 crc kubenswrapper[4965]: I1125 15:30:48.514949 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 15:30:54 crc kubenswrapper[4965]: I1125 15:30:54.525390 4965 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 15:30:54 crc kubenswrapper[4965]: I1125 15:30:54.526520 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 15:30:54 crc kubenswrapper[4965]: I1125 15:30:54.532279 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 15:30:54 crc kubenswrapper[4965]: I1125 15:30:54.971566 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 15:30:56 crc kubenswrapper[4965]: I1125 15:30:56.553090 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 15:30:56 crc kubenswrapper[4965]: I1125 15:30:56.554755 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 15:30:56 crc kubenswrapper[4965]: I1125 15:30:56.555216 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 15:30:56 crc kubenswrapper[4965]: I1125 15:30:56.564012 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 15:30:56 crc kubenswrapper[4965]: I1125 15:30:56.982997 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 15:30:56 crc kubenswrapper[4965]: I1125 15:30:56.992767 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.044592 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rrj2k"] Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.048041 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.101378 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrj2k"] Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.146332 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-utilities\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.146447 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zqx\" (UniqueName: \"kubernetes.io/projected/196c0cad-e979-4c26-8cd8-92b42ba0a123-kube-api-access-26zqx\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.146501 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-catalog-content\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.247939 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26zqx\" (UniqueName: \"kubernetes.io/projected/196c0cad-e979-4c26-8cd8-92b42ba0a123-kube-api-access-26zqx\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.248003 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-catalog-content\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.248081 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-utilities\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.248560 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-utilities\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.248738 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-catalog-content\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.268718 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zqx\" (UniqueName: \"kubernetes.io/projected/196c0cad-e979-4c26-8cd8-92b42ba0a123-kube-api-access-26zqx\") pod \"redhat-marketplace-rrj2k\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.390188 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:30:57 crc kubenswrapper[4965]: I1125 15:30:57.920861 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrj2k"] Nov 25 15:30:58 crc kubenswrapper[4965]: I1125 15:30:58.000667 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerStarted","Data":"9806783c5b6bf9bd9d87db8d5963e7e9d0f4e37e999a51567c0e9943d0971c9f"} Nov 25 15:30:58 crc kubenswrapper[4965]: I1125 15:30:58.772093 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:30:58 crc kubenswrapper[4965]: E1125 15:30:58.773012 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:30:59 crc kubenswrapper[4965]: I1125 15:30:59.013338 4965 generic.go:334] "Generic (PLEG): container finished" podID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerID="4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a" exitCode=0 Nov 25 15:30:59 crc kubenswrapper[4965]: I1125 15:30:59.013883 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerDied","Data":"4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a"} Nov 25 15:31:00 crc kubenswrapper[4965]: I1125 15:31:00.027448 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" 
event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerStarted","Data":"fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154"} Nov 25 15:31:01 crc kubenswrapper[4965]: I1125 15:31:01.041588 4965 generic.go:334] "Generic (PLEG): container finished" podID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerID="fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154" exitCode=0 Nov 25 15:31:01 crc kubenswrapper[4965]: I1125 15:31:01.041659 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerDied","Data":"fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154"} Nov 25 15:31:02 crc kubenswrapper[4965]: I1125 15:31:02.053768 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerStarted","Data":"637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a"} Nov 25 15:31:02 crc kubenswrapper[4965]: I1125 15:31:02.085428 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rrj2k" podStartSLOduration=2.660271843 podStartE2EDuration="5.085408673s" podCreationTimestamp="2025-11-25 15:30:57 +0000 UTC" firstStartedPulling="2025-11-25 15:30:59.015745807 +0000 UTC m=+1603.983339573" lastFinishedPulling="2025-11-25 15:31:01.440882657 +0000 UTC m=+1606.408476403" observedRunningTime="2025-11-25 15:31:02.075784201 +0000 UTC m=+1607.043377947" watchObservedRunningTime="2025-11-25 15:31:02.085408673 +0000 UTC m=+1607.053002419" Nov 25 15:31:06 crc kubenswrapper[4965]: I1125 15:31:06.584509 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:31:07 crc kubenswrapper[4965]: I1125 15:31:07.391427 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:31:07 crc kubenswrapper[4965]: I1125 15:31:07.391466 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:31:07 crc kubenswrapper[4965]: I1125 15:31:07.454589 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:31:07 crc kubenswrapper[4965]: I1125 15:31:07.486636 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:31:08 crc kubenswrapper[4965]: I1125 15:31:08.190208 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:31:08 crc kubenswrapper[4965]: I1125 15:31:08.264951 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrj2k"] Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.136226 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rrj2k" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="registry-server" containerID="cri-o://637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a" gracePeriod=2 Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.667057 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.778897 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:31:10 crc kubenswrapper[4965]: E1125 15:31:10.779294 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.808893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-catalog-content\") pod \"196c0cad-e979-4c26-8cd8-92b42ba0a123\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.809165 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-utilities\") pod \"196c0cad-e979-4c26-8cd8-92b42ba0a123\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.809209 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26zqx\" (UniqueName: \"kubernetes.io/projected/196c0cad-e979-4c26-8cd8-92b42ba0a123-kube-api-access-26zqx\") pod \"196c0cad-e979-4c26-8cd8-92b42ba0a123\" (UID: \"196c0cad-e979-4c26-8cd8-92b42ba0a123\") " Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.809911 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-utilities" (OuterVolumeSpecName: "utilities") pod "196c0cad-e979-4c26-8cd8-92b42ba0a123" (UID: "196c0cad-e979-4c26-8cd8-92b42ba0a123"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.810495 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.822800 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "196c0cad-e979-4c26-8cd8-92b42ba0a123" (UID: "196c0cad-e979-4c26-8cd8-92b42ba0a123"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.828760 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196c0cad-e979-4c26-8cd8-92b42ba0a123-kube-api-access-26zqx" (OuterVolumeSpecName: "kube-api-access-26zqx") pod "196c0cad-e979-4c26-8cd8-92b42ba0a123" (UID: "196c0cad-e979-4c26-8cd8-92b42ba0a123"). InnerVolumeSpecName "kube-api-access-26zqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.912750 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196c0cad-e979-4c26-8cd8-92b42ba0a123-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:10 crc kubenswrapper[4965]: I1125 15:31:10.912786 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26zqx\" (UniqueName: \"kubernetes.io/projected/196c0cad-e979-4c26-8cd8-92b42ba0a123-kube-api-access-26zqx\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.145770 4965 generic.go:334] "Generic (PLEG): container finished" podID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerID="637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a" exitCode=0 Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.145821 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerDied","Data":"637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a"} Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.145842 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrj2k" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.145862 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrj2k" event={"ID":"196c0cad-e979-4c26-8cd8-92b42ba0a123","Type":"ContainerDied","Data":"9806783c5b6bf9bd9d87db8d5963e7e9d0f4e37e999a51567c0e9943d0971c9f"} Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.145886 4965 scope.go:117] "RemoveContainer" containerID="637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.166374 4965 scope.go:117] "RemoveContainer" containerID="fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.200936 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrj2k"] Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.209023 4965 scope.go:117] "RemoveContainer" containerID="4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.232195 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrj2k"] Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.255167 4965 scope.go:117] "RemoveContainer" containerID="637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a" Nov 25 15:31:11 crc kubenswrapper[4965]: E1125 15:31:11.264515 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a\": container with ID starting with 637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a not found: ID does not exist" containerID="637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.264575 4965 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a"} err="failed to get container status \"637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a\": rpc error: code = NotFound desc = could not find container \"637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a\": container with ID starting with 637345201aea311991388e7a2c29840d81ab56142f3e33c9d89908419e54351a not found: ID does not exist" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.264614 4965 scope.go:117] "RemoveContainer" containerID="fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154" Nov 25 15:31:11 crc kubenswrapper[4965]: E1125 15:31:11.269747 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154\": container with ID starting with fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154 not found: ID does not exist" containerID="fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.269884 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154"} err="failed to get container status \"fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154\": rpc error: code = NotFound desc = could not find container \"fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154\": container with ID starting with fe16bf92fe614369c5376769a1a0d06098ccff8ee602f19166bd669e51612154 not found: ID does not exist" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.270009 4965 scope.go:117] "RemoveContainer" containerID="4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a" Nov 25 15:31:11 crc kubenswrapper[4965]: E1125 
15:31:11.270406 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a\": container with ID starting with 4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a not found: ID does not exist" containerID="4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.270506 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a"} err="failed to get container status \"4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a\": rpc error: code = NotFound desc = could not find container \"4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a\": container with ID starting with 4cbaba91484fb292344b4256fe6a2a2b490d972c2e6af96b45e595339202705a not found: ID does not exist" Nov 25 15:31:11 crc kubenswrapper[4965]: I1125 15:31:11.723871 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerName="rabbitmq" containerID="cri-o://485f97d15b5d763accc387f4ca2c06c6a93725f90d652e5d986b2b9ef4af1f74" gracePeriod=604795 Nov 25 15:31:12 crc kubenswrapper[4965]: I1125 15:31:12.800662 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" path="/var/lib/kubelet/pods/196c0cad-e979-4c26-8cd8-92b42ba0a123/volumes" Nov 25 15:31:13 crc kubenswrapper[4965]: I1125 15:31:13.109839 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" containerID="cri-o://2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6" gracePeriod=604795 Nov 25 15:31:14 crc 
kubenswrapper[4965]: I1125 15:31:14.913390 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vncbk"] Nov 25 15:31:14 crc kubenswrapper[4965]: E1125 15:31:14.914050 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="extract-utilities" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.914062 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="extract-utilities" Nov 25 15:31:14 crc kubenswrapper[4965]: E1125 15:31:14.914076 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="registry-server" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.914082 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="registry-server" Nov 25 15:31:14 crc kubenswrapper[4965]: E1125 15:31:14.914094 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="extract-content" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.914100 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="extract-content" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.914262 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="196c0cad-e979-4c26-8cd8-92b42ba0a123" containerName="registry-server" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.915532 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.938461 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vncbk"] Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.988152 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-catalog-content\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.988220 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-utilities\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:14 crc kubenswrapper[4965]: I1125 15:31:14.988337 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hnx\" (UniqueName: \"kubernetes.io/projected/f2de4912-4242-4b1e-acb5-d5e165f881ce-kube-api-access-m2hnx\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.089619 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-catalog-content\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.089659 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-utilities\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.089723 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hnx\" (UniqueName: \"kubernetes.io/projected/f2de4912-4242-4b1e-acb5-d5e165f881ce-kube-api-access-m2hnx\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.090225 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-utilities\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.090414 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-catalog-content\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.117055 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hnx\" (UniqueName: \"kubernetes.io/projected/f2de4912-4242-4b1e-acb5-d5e165f881ce-kube-api-access-m2hnx\") pod \"certified-operators-vncbk\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.235487 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:15 crc kubenswrapper[4965]: I1125 15:31:15.746170 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vncbk"] Nov 25 15:31:16 crc kubenswrapper[4965]: I1125 15:31:16.210294 4965 generic.go:334] "Generic (PLEG): container finished" podID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerID="8ba0a1d4ba6eac7b4c3e01b26df255a7dd4bb81c20d1238c46a91ca2c5694cf9" exitCode=0 Nov 25 15:31:16 crc kubenswrapper[4965]: I1125 15:31:16.210492 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerDied","Data":"8ba0a1d4ba6eac7b4c3e01b26df255a7dd4bb81c20d1238c46a91ca2c5694cf9"} Nov 25 15:31:16 crc kubenswrapper[4965]: I1125 15:31:16.210518 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerStarted","Data":"c37485e8549aecb8c9addbeb88eb8431e748d3abf0423323638d7a5762fdf15f"} Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.227204 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerStarted","Data":"7dda8b4f136bec9ed6be795a5bfcdce3872b0f1a71a5bf0c9a104abd9c165dfa"} Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.231411 4965 generic.go:334] "Generic (PLEG): container finished" podID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerID="485f97d15b5d763accc387f4ca2c06c6a93725f90d652e5d986b2b9ef4af1f74" exitCode=0 Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.231446 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"739d03f5-20b2-4c12-9f3e-fbe795ec890d","Type":"ContainerDied","Data":"485f97d15b5d763accc387f4ca2c06c6a93725f90d652e5d986b2b9ef4af1f74"} Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.472492 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.555749 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-erlang-cookie\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.555856 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-plugins\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.555893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-config-data\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.555979 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/739d03f5-20b2-4c12-9f3e-fbe795ec890d-erlang-cookie-secret\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556039 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-tls\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556090 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22smk\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-kube-api-access-22smk\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556104 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-server-conf\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556134 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-confd\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556164 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-plugins-conf\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556197 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556218 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/739d03f5-20b2-4c12-9f3e-fbe795ec890d-pod-info\") pod \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\" (UID: \"739d03f5-20b2-4c12-9f3e-fbe795ec890d\") " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556279 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.556581 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.558293 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.558566 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.566452 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-kube-api-access-22smk" (OuterVolumeSpecName: "kube-api-access-22smk") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "kube-api-access-22smk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.577380 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.580505 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739d03f5-20b2-4c12-9f3e-fbe795ec890d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.580585 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/739d03f5-20b2-4c12-9f3e-fbe795ec890d-pod-info" (OuterVolumeSpecName: "pod-info") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.598211 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.627394 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-server-conf" (OuterVolumeSpecName: "server-conf") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.636326 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-config-data" (OuterVolumeSpecName: "config-data") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.658979 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659009 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659018 4965 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/739d03f5-20b2-4c12-9f3e-fbe795ec890d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659028 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659037 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22smk\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-kube-api-access-22smk\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659049 4965 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659057 4965 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/739d03f5-20b2-4c12-9f3e-fbe795ec890d-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 
15:31:18.659087 4965 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.659096 4965 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/739d03f5-20b2-4c12-9f3e-fbe795ec890d-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.725537 4965 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.760952 4965 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.761187 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "739d03f5-20b2-4c12-9f3e-fbe795ec890d" (UID: "739d03f5-20b2-4c12-9f3e-fbe795ec890d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:18 crc kubenswrapper[4965]: I1125 15:31:18.862342 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/739d03f5-20b2-4c12-9f3e-fbe795ec890d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.240835 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"739d03f5-20b2-4c12-9f3e-fbe795ec890d","Type":"ContainerDied","Data":"3a341c587c33d219d03462d7a93fce9cc17d17e674be7d4e971b45b8ab0aa777"} Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.240901 4965 scope.go:117] "RemoveContainer" containerID="485f97d15b5d763accc387f4ca2c06c6a93725f90d652e5d986b2b9ef4af1f74" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.240929 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.251739 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.267520 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.267676 4965 scope.go:117] "RemoveContainer" containerID="cf89e00c635745f9ff3cd4216c52d7f9cf91427b5734375cdbbf6a9cd925aaa5" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.275043 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.314550 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:31:19 crc kubenswrapper[4965]: E1125 15:31:19.314916 
4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerName="rabbitmq" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.314931 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerName="rabbitmq" Nov 25 15:31:19 crc kubenswrapper[4965]: E1125 15:31:19.314951 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerName="setup-container" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.314958 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerName="setup-container" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.315134 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" containerName="rabbitmq" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.316046 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.319691 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.320023 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.320184 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.320778 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.322494 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.334169 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gsqfl" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.345225 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.361656 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.370469 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.370651 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.370742 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.370824 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.370899 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.371000 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.371087 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.371194 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.371272 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvb7\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-kube-api-access-cqvb7\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.371362 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.379697 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481367 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " 
pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481414 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481446 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481470 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481490 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481544 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481586 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481629 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481652 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvb7\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-kube-api-access-cqvb7\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481694 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.481724 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.482679 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.482763 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.483341 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.483615 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.484098 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.485450 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.488037 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.488043 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.490001 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.490518 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.503870 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvb7\" (UniqueName: \"kubernetes.io/projected/d5400ed8-9880-47b3-b8e7-5de35a2c7e00-kube-api-access-cqvb7\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.539525 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"d5400ed8-9880-47b3-b8e7-5de35a2c7e00\") " 
pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.675234 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.770990 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.786752 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-server-conf\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.786806 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-config-data\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.786835 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-plugins\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787181 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787217 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-kube-api-access-2hcd4\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787285 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-confd\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787311 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787354 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-erlang-cookie-secret\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787481 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-pod-info\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787527 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-plugins-conf\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787552 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-erlang-cookie\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.787571 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-tls\") pod \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\" (UID: \"67d0186d-7eca-48a0-9cc8-56ce4d1caa38\") " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.788445 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.788566 4965 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.788586 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.789033 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.811105 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.811226 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-kube-api-access-2hcd4" (OuterVolumeSpecName: "kube-api-access-2hcd4") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "kube-api-access-2hcd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.811275 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.811326 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.819605 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-pod-info" (OuterVolumeSpecName: "pod-info") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.869614 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-config-data" (OuterVolumeSpecName: "config-data") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.877488 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-server-conf" (OuterVolumeSpecName: "server-conf") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891167 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-kube-api-access-2hcd4\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891221 4965 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891236 4965 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891248 4965 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891261 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891274 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891288 4965 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.891299 4965 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.916838 4965 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.972168 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "67d0186d-7eca-48a0-9cc8-56ce4d1caa38" (UID: "67d0186d-7eca-48a0-9cc8-56ce4d1caa38"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.992600 4965 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67d0186d-7eca-48a0-9cc8-56ce4d1caa38-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:19 crc kubenswrapper[4965]: I1125 15:31:19.992634 4965 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.165778 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 15:31:20 crc kubenswrapper[4965]: W1125 15:31:20.171687 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5400ed8_9880_47b3_b8e7_5de35a2c7e00.slice/crio-213fa0772777ad5f30f550d2d29a04ee26bde88ed7ae71994d3f0020f53071e2 WatchSource:0}: Error finding container 213fa0772777ad5f30f550d2d29a04ee26bde88ed7ae71994d3f0020f53071e2: Status 404 returned error can't find the container with id 213fa0772777ad5f30f550d2d29a04ee26bde88ed7ae71994d3f0020f53071e2 Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.259711 4965 generic.go:334] "Generic (PLEG): container finished" podID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerID="2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6" exitCode=0 Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.259765 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"67d0186d-7eca-48a0-9cc8-56ce4d1caa38","Type":"ContainerDied","Data":"2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6"} Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.259790 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"67d0186d-7eca-48a0-9cc8-56ce4d1caa38","Type":"ContainerDied","Data":"b385bcb1fb02059d7fe7559d4667b942eff49f4849a5c74c43f69d9ecefe4024"} Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.259806 4965 scope.go:117] "RemoveContainer" containerID="2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.259933 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.266621 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5400ed8-9880-47b3-b8e7-5de35a2c7e00","Type":"ContainerStarted","Data":"213fa0772777ad5f30f550d2d29a04ee26bde88ed7ae71994d3f0020f53071e2"} Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.295610 4965 scope.go:117] "RemoveContainer" containerID="757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.317774 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.326438 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.334034 4965 scope.go:117] "RemoveContainer" containerID="2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6" Nov 25 15:31:20 crc kubenswrapper[4965]: E1125 15:31:20.335513 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6\": container with ID starting with 2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6 not found: ID does not exist" containerID="2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6" Nov 25 15:31:20 crc kubenswrapper[4965]: 
I1125 15:31:20.335555 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6"} err="failed to get container status \"2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6\": rpc error: code = NotFound desc = could not find container \"2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6\": container with ID starting with 2473a652b113e0bd42202b813521613eaac7637881314b626e7ccb38040ff2a6 not found: ID does not exist" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.335587 4965 scope.go:117] "RemoveContainer" containerID="757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d" Nov 25 15:31:20 crc kubenswrapper[4965]: E1125 15:31:20.336243 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d\": container with ID starting with 757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d not found: ID does not exist" containerID="757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.336296 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d"} err="failed to get container status \"757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d\": rpc error: code = NotFound desc = could not find container \"757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d\": container with ID starting with 757cc4c018da141ccc2b52b67bd07970c0115c5b2c817f771cdc0f3b6c62d19d not found: ID does not exist" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.345147 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:31:20 crc kubenswrapper[4965]: E1125 
15:31:20.346131 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="setup-container" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.346228 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="setup-container" Nov 25 15:31:20 crc kubenswrapper[4965]: E1125 15:31:20.346312 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.346414 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.346710 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" containerName="rabbitmq" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.348069 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.350519 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t5nv5" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.354483 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.354653 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.354813 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.354912 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.355068 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.356186 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.370940 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511264 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511326 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hclph\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-kube-api-access-hclph\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511372 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511391 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511419 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511451 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511501 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511699 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511815 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511856 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a811059-77da-436c-95e6-fddf5baa649c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.511906 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a811059-77da-436c-95e6-fddf5baa649c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613421 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613511 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613552 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613592 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613666 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613696 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613715 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613732 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a811059-77da-436c-95e6-fddf5baa649c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613752 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a811059-77da-436c-95e6-fddf5baa649c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613832 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613877 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hclph\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-kube-api-access-hclph\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 
15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.613915 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.614172 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.614654 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.615330 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.615532 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.615524 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a811059-77da-436c-95e6-fddf5baa649c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.619346 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.620259 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a811059-77da-436c-95e6-fddf5baa649c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.621885 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a811059-77da-436c-95e6-fddf5baa649c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.625185 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.636325 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclph\" (UniqueName: \"kubernetes.io/projected/9a811059-77da-436c-95e6-fddf5baa649c-kube-api-access-hclph\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.645399 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a811059-77da-436c-95e6-fddf5baa649c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.666997 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.786190 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d0186d-7eca-48a0-9cc8-56ce4d1caa38" path="/var/lib/kubelet/pods/67d0186d-7eca-48a0-9cc8-56ce4d1caa38/volumes" Nov 25 15:31:20 crc kubenswrapper[4965]: I1125 15:31:20.787449 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739d03f5-20b2-4c12-9f3e-fbe795ec890d" path="/var/lib/kubelet/pods/739d03f5-20b2-4c12-9f3e-fbe795ec890d/volumes" Nov 25 15:31:21 crc kubenswrapper[4965]: I1125 15:31:21.128686 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 15:31:21 crc kubenswrapper[4965]: I1125 15:31:21.281385 4965 generic.go:334] "Generic (PLEG): container finished" podID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerID="7dda8b4f136bec9ed6be795a5bfcdce3872b0f1a71a5bf0c9a104abd9c165dfa" exitCode=0 Nov 25 15:31:21 crc kubenswrapper[4965]: I1125 15:31:21.281526 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerDied","Data":"7dda8b4f136bec9ed6be795a5bfcdce3872b0f1a71a5bf0c9a104abd9c165dfa"} Nov 25 15:31:21 crc kubenswrapper[4965]: I1125 15:31:21.285842 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a811059-77da-436c-95e6-fddf5baa649c","Type":"ContainerStarted","Data":"e4c750520f5c47e08a22c9f24ab0eaaf1faf3568b10fef50433e04f0050b49ee"} Nov 25 15:31:21 crc kubenswrapper[4965]: I1125 15:31:21.772230 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:31:21 crc kubenswrapper[4965]: E1125 15:31:21.772992 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:31:22 crc kubenswrapper[4965]: I1125 15:31:22.295729 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerStarted","Data":"98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67"} Nov 25 15:31:22 crc kubenswrapper[4965]: I1125 15:31:22.299355 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5400ed8-9880-47b3-b8e7-5de35a2c7e00","Type":"ContainerStarted","Data":"7381e0b7596d07a7d5587d02fd0f2a0b7bff01c6350af8303a9215f2cdaed663"} Nov 25 15:31:22 crc kubenswrapper[4965]: I1125 15:31:22.317047 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vncbk" podStartSLOduration=2.604295852 podStartE2EDuration="8.317031214s" podCreationTimestamp="2025-11-25 15:31:14 +0000 UTC" firstStartedPulling="2025-11-25 15:31:16.21279472 +0000 UTC m=+1621.180388466" lastFinishedPulling="2025-11-25 15:31:21.925530082 +0000 UTC m=+1626.893123828" observedRunningTime="2025-11-25 
15:31:22.313082857 +0000 UTC m=+1627.280676603" watchObservedRunningTime="2025-11-25 15:31:22.317031214 +0000 UTC m=+1627.284624960" Nov 25 15:31:23 crc kubenswrapper[4965]: I1125 15:31:23.310170 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a811059-77da-436c-95e6-fddf5baa649c","Type":"ContainerStarted","Data":"f265ee6eeb0052f44ebd81743a64afeaabe8334aab51afe0857c37853feb6973"} Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.047563 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qm2ks"] Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.050040 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.067417 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm2ks"] Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.200867 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c585ba-c0e6-4733-bf34-424790b0fafc-catalog-content\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.201084 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c585ba-c0e6-4733-bf34-424790b0fafc-utilities\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.201157 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgll\" 
(UniqueName: \"kubernetes.io/projected/81c585ba-c0e6-4733-bf34-424790b0fafc-kube-api-access-gxgll\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.235569 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.236033 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.283887 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.302309 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c585ba-c0e6-4733-bf34-424790b0fafc-utilities\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.302414 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgll\" (UniqueName: \"kubernetes.io/projected/81c585ba-c0e6-4733-bf34-424790b0fafc-kube-api-access-gxgll\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.302490 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c585ba-c0e6-4733-bf34-424790b0fafc-catalog-content\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" 
Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.302883 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c585ba-c0e6-4733-bf34-424790b0fafc-utilities\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.302903 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c585ba-c0e6-4733-bf34-424790b0fafc-catalog-content\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.330057 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgll\" (UniqueName: \"kubernetes.io/projected/81c585ba-c0e6-4733-bf34-424790b0fafc-kube-api-access-gxgll\") pod \"community-operators-qm2ks\" (UID: \"81c585ba-c0e6-4733-bf34-424790b0fafc\") " pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.371374 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:25 crc kubenswrapper[4965]: W1125 15:31:25.872204 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c585ba_c0e6_4733_bf34_424790b0fafc.slice/crio-4ed8082debf7fc492164122e33deb090a7565a447d9aeec4af31d2b7d2fb6d3f WatchSource:0}: Error finding container 4ed8082debf7fc492164122e33deb090a7565a447d9aeec4af31d2b7d2fb6d3f: Status 404 returned error can't find the container with id 4ed8082debf7fc492164122e33deb090a7565a447d9aeec4af31d2b7d2fb6d3f Nov 25 15:31:25 crc kubenswrapper[4965]: I1125 15:31:25.873918 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm2ks"] Nov 25 15:31:26 crc kubenswrapper[4965]: E1125 15:31:26.225735 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c585ba_c0e6_4733_bf34_424790b0fafc.slice/crio-2bd2d829370afcad6bcfbfcb2960bd753e4794905103f2fb8e50a4cea2b89561.scope\": RecentStats: unable to find data in memory cache]" Nov 25 15:31:26 crc kubenswrapper[4965]: I1125 15:31:26.336055 4965 generic.go:334] "Generic (PLEG): container finished" podID="81c585ba-c0e6-4733-bf34-424790b0fafc" containerID="2bd2d829370afcad6bcfbfcb2960bd753e4794905103f2fb8e50a4cea2b89561" exitCode=0 Nov 25 15:31:26 crc kubenswrapper[4965]: I1125 15:31:26.336113 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2ks" event={"ID":"81c585ba-c0e6-4733-bf34-424790b0fafc","Type":"ContainerDied","Data":"2bd2d829370afcad6bcfbfcb2960bd753e4794905103f2fb8e50a4cea2b89561"} Nov 25 15:31:26 crc kubenswrapper[4965]: I1125 15:31:26.336159 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2ks" 
event={"ID":"81c585ba-c0e6-4733-bf34-424790b0fafc","Type":"ContainerStarted","Data":"4ed8082debf7fc492164122e33deb090a7565a447d9aeec4af31d2b7d2fb6d3f"} Nov 25 15:31:27 crc kubenswrapper[4965]: I1125 15:31:27.948441 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-r7s4r"] Nov 25 15:31:27 crc kubenswrapper[4965]: I1125 15:31:27.950419 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:27 crc kubenswrapper[4965]: I1125 15:31:27.957308 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 25 15:31:27 crc kubenswrapper[4965]: I1125 15:31:27.986908 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-r7s4r"] Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.055237 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.055303 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj6mb\" (UniqueName: \"kubernetes.io/projected/728fd734-6d25-4b37-9306-50e10e358e13-kube-api-access-gj6mb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.055336 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: 
\"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.055517 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-config\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.055956 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.056074 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.165939 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-config\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.166062 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.166095 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.166135 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.166167 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj6mb\" (UniqueName: \"kubernetes.io/projected/728fd734-6d25-4b37-9306-50e10e358e13-kube-api-access-gj6mb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.166194 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.167227 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.167874 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-config\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.168421 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.169019 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.169508 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.194045 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj6mb\" (UniqueName: \"kubernetes.io/projected/728fd734-6d25-4b37-9306-50e10e358e13-kube-api-access-gj6mb\") pod \"dnsmasq-dns-6447ccbd8f-r7s4r\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 
15:31:28.290432 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:28 crc kubenswrapper[4965]: I1125 15:31:28.768951 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-r7s4r"] Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.220697 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd"] Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.227487 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.231290 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.231732 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.231885 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.232066 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.240643 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd"] Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.314996 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.315115 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.315162 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqc85\" (UniqueName: \"kubernetes.io/projected/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-kube-api-access-dqc85\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.315190 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416253 4965 generic.go:334] "Generic (PLEG): container finished" podID="728fd734-6d25-4b37-9306-50e10e358e13" containerID="a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12" exitCode=0 Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416307 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" 
event={"ID":"728fd734-6d25-4b37-9306-50e10e358e13","Type":"ContainerDied","Data":"a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12"} Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416336 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" event={"ID":"728fd734-6d25-4b37-9306-50e10e358e13","Type":"ContainerStarted","Data":"e046c28eadf9f0b98a8fce5b47a100f5f8e0d1c345a08571d930fb431bf86e5a"} Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416451 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416548 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416588 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqc85\" (UniqueName: \"kubernetes.io/projected/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-kube-api-access-dqc85\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.416613 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.428650 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.428683 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.428855 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc kubenswrapper[4965]: I1125 15:31:29.449201 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqc85\" (UniqueName: \"kubernetes.io/projected/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-kube-api-access-dqc85\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:29 crc 
kubenswrapper[4965]: I1125 15:31:29.560219 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:31:30 crc kubenswrapper[4965]: I1125 15:31:30.491903 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" event={"ID":"728fd734-6d25-4b37-9306-50e10e358e13","Type":"ContainerStarted","Data":"eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04"} Nov 25 15:31:30 crc kubenswrapper[4965]: I1125 15:31:30.492286 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:30 crc kubenswrapper[4965]: I1125 15:31:30.523098 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" podStartSLOduration=3.523076335 podStartE2EDuration="3.523076335s" podCreationTimestamp="2025-11-25 15:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:31:30.515396866 +0000 UTC m=+1635.482990612" watchObservedRunningTime="2025-11-25 15:31:30.523076335 +0000 UTC m=+1635.490670081" Nov 25 15:31:30 crc kubenswrapper[4965]: I1125 15:31:30.579883 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd"] Nov 25 15:31:32 crc kubenswrapper[4965]: I1125 15:31:32.791791 4965 scope.go:117] "RemoveContainer" containerID="60ff6ebdcd5df07ce60550aa76f7e3b780a9bf131c02982fc5c61be769315d3b" Nov 25 15:31:33 crc kubenswrapper[4965]: I1125 15:31:33.029227 4965 scope.go:117] "RemoveContainer" containerID="21e62dea2ce289ec599ff990be43132fe7f207dc9f4fbdc15c2a9cc8680622ba" Nov 25 15:31:33 crc kubenswrapper[4965]: I1125 15:31:33.539989 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" 
event={"ID":"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2","Type":"ContainerStarted","Data":"81f81f777b793a43c1711655c9d05c6375db0ad6526bba5f87532753025e1618"} Nov 25 15:31:33 crc kubenswrapper[4965]: I1125 15:31:33.548860 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2ks" event={"ID":"81c585ba-c0e6-4733-bf34-424790b0fafc","Type":"ContainerStarted","Data":"73f52c1d731c3a21bb1a4678f01c92a3ea042c665a91176df19097a17db4300e"} Nov 25 15:31:34 crc kubenswrapper[4965]: I1125 15:31:34.561496 4965 generic.go:334] "Generic (PLEG): container finished" podID="81c585ba-c0e6-4733-bf34-424790b0fafc" containerID="73f52c1d731c3a21bb1a4678f01c92a3ea042c665a91176df19097a17db4300e" exitCode=0 Nov 25 15:31:34 crc kubenswrapper[4965]: I1125 15:31:34.561572 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2ks" event={"ID":"81c585ba-c0e6-4733-bf34-424790b0fafc","Type":"ContainerDied","Data":"73f52c1d731c3a21bb1a4678f01c92a3ea042c665a91176df19097a17db4300e"} Nov 25 15:31:35 crc kubenswrapper[4965]: I1125 15:31:35.327115 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:35 crc kubenswrapper[4965]: I1125 15:31:35.376139 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vncbk"] Nov 25 15:31:35 crc kubenswrapper[4965]: I1125 15:31:35.570408 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vncbk" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="registry-server" containerID="cri-o://98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67" gracePeriod=2 Nov 25 15:31:35 crc kubenswrapper[4965]: I1125 15:31:35.772331 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:31:35 crc 
kubenswrapper[4965]: E1125 15:31:35.772592 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:31:37 crc kubenswrapper[4965]: I1125 15:31:37.600812 4965 generic.go:334] "Generic (PLEG): container finished" podID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerID="98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67" exitCode=0 Nov 25 15:31:37 crc kubenswrapper[4965]: I1125 15:31:37.601017 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerDied","Data":"98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67"} Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.293251 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.371882 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-st74t"] Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.372252 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-st74t" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="dnsmasq-dns" containerID="cri-o://4fd1f88c2e262bf818afd5b86c36aa1f88445dcc8b1707479ca612416cd8d984" gracePeriod=10 Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.569370 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-msf8j"] Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.575395 4965 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.591216 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-msf8j"] Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.618226 4965 generic.go:334] "Generic (PLEG): container finished" podID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerID="4fd1f88c2e262bf818afd5b86c36aa1f88445dcc8b1707479ca612416cd8d984" exitCode=0 Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.618271 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-st74t" event={"ID":"9e098e76-ffb9-40ad-9312-96f70fc8d0d7","Type":"ContainerDied","Data":"4fd1f88c2e262bf818afd5b86c36aa1f88445dcc8b1707479ca612416cd8d984"} Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.733345 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.733447 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-config\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.733510 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhfd\" (UniqueName: \"kubernetes.io/projected/082f410e-8793-4651-be56-a0c486eebdbc-kube-api-access-5hhfd\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: 
\"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.733572 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.733632 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.733658 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837024 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837104 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-config\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: 
\"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837156 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhfd\" (UniqueName: \"kubernetes.io/projected/082f410e-8793-4651-be56-a0c486eebdbc-kube-api-access-5hhfd\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837210 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837270 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837290 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.837936 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " 
pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.838162 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.838533 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-config\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.838547 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.838748 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/082f410e-8793-4651-be56-a0c486eebdbc-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 15:31:38.862571 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhfd\" (UniqueName: \"kubernetes.io/projected/082f410e-8793-4651-be56-a0c486eebdbc-kube-api-access-5hhfd\") pod \"dnsmasq-dns-6c5d8cf46f-msf8j\" (UID: \"082f410e-8793-4651-be56-a0c486eebdbc\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:38 crc kubenswrapper[4965]: I1125 
15:31:38.896674 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:40 crc kubenswrapper[4965]: I1125 15:31:40.922462 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-st74t" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: connect: connection refused" Nov 25 15:31:44 crc kubenswrapper[4965]: I1125 15:31:44.391922 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-msf8j"] Nov 25 15:31:45 crc kubenswrapper[4965]: E1125 15:31:45.237231 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67 is running failed: container process not found" containerID="98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:31:45 crc kubenswrapper[4965]: E1125 15:31:45.237711 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67 is running failed: container process not found" containerID="98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:31:45 crc kubenswrapper[4965]: E1125 15:31:45.238181 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67 is running failed: container process not found" containerID="98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:31:45 crc 
kubenswrapper[4965]: E1125 15:31:45.238222 4965 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-vncbk" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="registry-server" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.473639 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-st74t" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.501193 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.587162 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-sb\") pod \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.587197 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-nb\") pod \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.587264 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-755jh\" (UniqueName: \"kubernetes.io/projected/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-kube-api-access-755jh\") pod \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.587332 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-config\") pod \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.587394 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-dns-svc\") pod \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\" (UID: \"9e098e76-ffb9-40ad-9312-96f70fc8d0d7\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.595370 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-kube-api-access-755jh" (OuterVolumeSpecName: "kube-api-access-755jh") pod "9e098e76-ffb9-40ad-9312-96f70fc8d0d7" (UID: "9e098e76-ffb9-40ad-9312-96f70fc8d0d7"). InnerVolumeSpecName "kube-api-access-755jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.639799 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e098e76-ffb9-40ad-9312-96f70fc8d0d7" (UID: "9e098e76-ffb9-40ad-9312-96f70fc8d0d7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.650627 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-config" (OuterVolumeSpecName: "config") pod "9e098e76-ffb9-40ad-9312-96f70fc8d0d7" (UID: "9e098e76-ffb9-40ad-9312-96f70fc8d0d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.688227 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-st74t" event={"ID":"9e098e76-ffb9-40ad-9312-96f70fc8d0d7","Type":"ContainerDied","Data":"0004dca969bce8ac84ed56b8a60152c55a3a94af88c92c8b849843aa6cfa17f1"} Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.688275 4965 scope.go:117] "RemoveContainer" containerID="4fd1f88c2e262bf818afd5b86c36aa1f88445dcc8b1707479ca612416cd8d984" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.688505 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-st74t" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.688543 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-utilities\") pod \"f2de4912-4242-4b1e-acb5-d5e165f881ce\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.688605 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-catalog-content\") pod \"f2de4912-4242-4b1e-acb5-d5e165f881ce\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.690611 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-utilities" (OuterVolumeSpecName: "utilities") pod "f2de4912-4242-4b1e-acb5-d5e165f881ce" (UID: "f2de4912-4242-4b1e-acb5-d5e165f881ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.691336 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e098e76-ffb9-40ad-9312-96f70fc8d0d7" (UID: "9e098e76-ffb9-40ad-9312-96f70fc8d0d7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.691598 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e098e76-ffb9-40ad-9312-96f70fc8d0d7" (UID: "9e098e76-ffb9-40ad-9312-96f70fc8d0d7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.691929 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hnx\" (UniqueName: \"kubernetes.io/projected/f2de4912-4242-4b1e-acb5-d5e165f881ce-kube-api-access-m2hnx\") pod \"f2de4912-4242-4b1e-acb5-d5e165f881ce\" (UID: \"f2de4912-4242-4b1e-acb5-d5e165f881ce\") " Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.692711 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-755jh\" (UniqueName: \"kubernetes.io/projected/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-kube-api-access-755jh\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.692734 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.692748 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.692759 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.692774 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.692787 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e098e76-ffb9-40ad-9312-96f70fc8d0d7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.695681 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2de4912-4242-4b1e-acb5-d5e165f881ce-kube-api-access-m2hnx" (OuterVolumeSpecName: "kube-api-access-m2hnx") pod "f2de4912-4242-4b1e-acb5-d5e165f881ce" (UID: "f2de4912-4242-4b1e-acb5-d5e165f881ce"). InnerVolumeSpecName "kube-api-access-m2hnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.700519 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" event={"ID":"082f410e-8793-4651-be56-a0c486eebdbc","Type":"ContainerStarted","Data":"2dafe97356b4915714155bab0fd2e1b670a8b232eb5eab0731dc937d6c41c840"} Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.705191 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vncbk" event={"ID":"f2de4912-4242-4b1e-acb5-d5e165f881ce","Type":"ContainerDied","Data":"c37485e8549aecb8c9addbeb88eb8431e748d3abf0423323638d7a5762fdf15f"} Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.705302 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vncbk" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.750266 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2de4912-4242-4b1e-acb5-d5e165f881ce" (UID: "f2de4912-4242-4b1e-acb5-d5e165f881ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.794601 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hnx\" (UniqueName: \"kubernetes.io/projected/f2de4912-4242-4b1e-acb5-d5e165f881ce-kube-api-access-m2hnx\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:46 crc kubenswrapper[4965]: I1125 15:31:46.794848 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2de4912-4242-4b1e-acb5-d5e165f881ce-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.013026 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-st74t"] Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.031694 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-st74t"] Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.039530 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vncbk"] Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.047339 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vncbk"] Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.446126 4965 scope.go:117] "RemoveContainer" containerID="26a860b1750ff11e609f4b8aa8615f1dbc39a54e5475d8be15e6155680b112fc" Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.771428 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:31:47 crc kubenswrapper[4965]: E1125 15:31:47.771834 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:31:47 crc kubenswrapper[4965]: I1125 15:31:47.868210 4965 scope.go:117] "RemoveContainer" containerID="98b5d0defb2e04eec514571dc77d44e972f63e65e5eff468b9129de0e0c4da67" Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.026536 4965 scope.go:117] "RemoveContainer" containerID="7dda8b4f136bec9ed6be795a5bfcdce3872b0f1a71a5bf0c9a104abd9c165dfa" Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.074945 4965 scope.go:117] "RemoveContainer" containerID="8ba0a1d4ba6eac7b4c3e01b26df255a7dd4bb81c20d1238c46a91ca2c5694cf9" Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.730311 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm2ks" event={"ID":"81c585ba-c0e6-4733-bf34-424790b0fafc","Type":"ContainerStarted","Data":"6e4df9562f94ea8b915282cc51f6b10f059060f1720799aaeb5c130a277ffffb"} Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.733011 4965 generic.go:334] "Generic (PLEG): container finished" podID="082f410e-8793-4651-be56-a0c486eebdbc" containerID="fa03cd9d33fed1648fc449af04f01755524c0c278bd3ecd5dc7ed9c359b58022" exitCode=0 Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.733106 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" event={"ID":"082f410e-8793-4651-be56-a0c486eebdbc","Type":"ContainerDied","Data":"fa03cd9d33fed1648fc449af04f01755524c0c278bd3ecd5dc7ed9c359b58022"} Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.743560 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" event={"ID":"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2","Type":"ContainerStarted","Data":"54333ff35d46b21292f53f7d7488e1bb6a4e28568db6acf2ea4fa37ccfdbe903"} Nov 25 
15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.764865 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qm2ks" podStartSLOduration=2.235281438 podStartE2EDuration="23.764822931s" podCreationTimestamp="2025-11-25 15:31:25 +0000 UTC" firstStartedPulling="2025-11-25 15:31:26.339312442 +0000 UTC m=+1631.306906188" lastFinishedPulling="2025-11-25 15:31:47.868853935 +0000 UTC m=+1652.836447681" observedRunningTime="2025-11-25 15:31:48.759186338 +0000 UTC m=+1653.726780104" watchObservedRunningTime="2025-11-25 15:31:48.764822931 +0000 UTC m=+1653.732416687" Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.795281 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" path="/var/lib/kubelet/pods/9e098e76-ffb9-40ad-9312-96f70fc8d0d7/volumes" Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.800906 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" podStartSLOduration=4.961286428 podStartE2EDuration="19.800589918s" podCreationTimestamp="2025-11-25 15:31:29 +0000 UTC" firstStartedPulling="2025-11-25 15:31:33.029169174 +0000 UTC m=+1637.996762920" lastFinishedPulling="2025-11-25 15:31:47.868472664 +0000 UTC m=+1652.836066410" observedRunningTime="2025-11-25 15:31:48.781895697 +0000 UTC m=+1653.749489443" watchObservedRunningTime="2025-11-25 15:31:48.800589918 +0000 UTC m=+1653.768183664" Nov 25 15:31:48 crc kubenswrapper[4965]: I1125 15:31:48.802298 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" path="/var/lib/kubelet/pods/f2de4912-4242-4b1e-acb5-d5e165f881ce/volumes" Nov 25 15:31:49 crc kubenswrapper[4965]: I1125 15:31:49.753333 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" 
event={"ID":"082f410e-8793-4651-be56-a0c486eebdbc","Type":"ContainerStarted","Data":"413e11f2ea0ddcdd342ef9eabea775bedc8dfa90af4e3814343d40f6600f0f7c"} Nov 25 15:31:50 crc kubenswrapper[4965]: I1125 15:31:50.760922 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:50 crc kubenswrapper[4965]: I1125 15:31:50.922767 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-st74t" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: i/o timeout" Nov 25 15:31:54 crc kubenswrapper[4965]: I1125 15:31:54.817615 4965 generic.go:334] "Generic (PLEG): container finished" podID="d5400ed8-9880-47b3-b8e7-5de35a2c7e00" containerID="7381e0b7596d07a7d5587d02fd0f2a0b7bff01c6350af8303a9215f2cdaed663" exitCode=0 Nov 25 15:31:54 crc kubenswrapper[4965]: I1125 15:31:54.817811 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5400ed8-9880-47b3-b8e7-5de35a2c7e00","Type":"ContainerDied","Data":"7381e0b7596d07a7d5587d02fd0f2a0b7bff01c6350af8303a9215f2cdaed663"} Nov 25 15:31:54 crc kubenswrapper[4965]: I1125 15:31:54.854858 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" podStartSLOduration=16.854834678 podStartE2EDuration="16.854834678s" podCreationTimestamp="2025-11-25 15:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:31:49.777795111 +0000 UTC m=+1654.745388867" watchObservedRunningTime="2025-11-25 15:31:54.854834678 +0000 UTC m=+1659.822428424" Nov 25 15:31:55 crc kubenswrapper[4965]: I1125 15:31:55.372180 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:55 crc 
kubenswrapper[4965]: I1125 15:31:55.372243 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:31:55 crc kubenswrapper[4965]: I1125 15:31:55.829913 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5400ed8-9880-47b3-b8e7-5de35a2c7e00","Type":"ContainerStarted","Data":"b87027029d0882379e2be96a5d3ce627dd5dcbc554c69e635893f5ec634b4a93"} Nov 25 15:31:55 crc kubenswrapper[4965]: I1125 15:31:55.831041 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 25 15:31:55 crc kubenswrapper[4965]: I1125 15:31:55.831791 4965 generic.go:334] "Generic (PLEG): container finished" podID="9a811059-77da-436c-95e6-fddf5baa649c" containerID="f265ee6eeb0052f44ebd81743a64afeaabe8334aab51afe0857c37853feb6973" exitCode=0 Nov 25 15:31:55 crc kubenswrapper[4965]: I1125 15:31:55.831863 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a811059-77da-436c-95e6-fddf5baa649c","Type":"ContainerDied","Data":"f265ee6eeb0052f44ebd81743a64afeaabe8334aab51afe0857c37853feb6973"} Nov 25 15:31:55 crc kubenswrapper[4965]: I1125 15:31:55.910909 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.910857182 podStartE2EDuration="36.910857182s" podCreationTimestamp="2025-11-25 15:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:31:55.875593419 +0000 UTC m=+1660.843187175" watchObservedRunningTime="2025-11-25 15:31:55.910857182 +0000 UTC m=+1660.878450928" Nov 25 15:31:56 crc kubenswrapper[4965]: I1125 15:31:56.414791 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qm2ks" podUID="81c585ba-c0e6-4733-bf34-424790b0fafc" 
containerName="registry-server" probeResult="failure" output=< Nov 25 15:31:56 crc kubenswrapper[4965]: timeout: failed to connect service ":50051" within 1s Nov 25 15:31:56 crc kubenswrapper[4965]: > Nov 25 15:31:56 crc kubenswrapper[4965]: I1125 15:31:56.855871 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a811059-77da-436c-95e6-fddf5baa649c","Type":"ContainerStarted","Data":"b7871121c4ad31d799515b8b283f011ab73ce4dca04815d770b26334b68ec667"} Nov 25 15:31:56 crc kubenswrapper[4965]: I1125 15:31:56.856502 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:31:56 crc kubenswrapper[4965]: I1125 15:31:56.886553 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.886522852 podStartE2EDuration="36.886522852s" podCreationTimestamp="2025-11-25 15:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:31:56.884110246 +0000 UTC m=+1661.851704022" watchObservedRunningTime="2025-11-25 15:31:56.886522852 +0000 UTC m=+1661.854116598" Nov 25 15:31:58 crc kubenswrapper[4965]: I1125 15:31:58.898701 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c5d8cf46f-msf8j" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.009124 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-r7s4r"] Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.009373 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" podUID="728fd734-6d25-4b37-9306-50e10e358e13" containerName="dnsmasq-dns" containerID="cri-o://eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04" gracePeriod=10 Nov 25 15:31:59 crc kubenswrapper[4965]: 
I1125 15:31:59.526825 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.708780 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-sb\") pod \"728fd734-6d25-4b37-9306-50e10e358e13\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.709321 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-openstack-edpm-ipam\") pod \"728fd734-6d25-4b37-9306-50e10e358e13\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.709409 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-dns-svc\") pod \"728fd734-6d25-4b37-9306-50e10e358e13\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.709472 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-nb\") pod \"728fd734-6d25-4b37-9306-50e10e358e13\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.709574 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj6mb\" (UniqueName: \"kubernetes.io/projected/728fd734-6d25-4b37-9306-50e10e358e13-kube-api-access-gj6mb\") pod \"728fd734-6d25-4b37-9306-50e10e358e13\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.709640 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-config\") pod \"728fd734-6d25-4b37-9306-50e10e358e13\" (UID: \"728fd734-6d25-4b37-9306-50e10e358e13\") " Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.716562 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728fd734-6d25-4b37-9306-50e10e358e13-kube-api-access-gj6mb" (OuterVolumeSpecName: "kube-api-access-gj6mb") pod "728fd734-6d25-4b37-9306-50e10e358e13" (UID: "728fd734-6d25-4b37-9306-50e10e358e13"). InnerVolumeSpecName "kube-api-access-gj6mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.772039 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:31:59 crc kubenswrapper[4965]: E1125 15:31:59.772384 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.785854 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "728fd734-6d25-4b37-9306-50e10e358e13" (UID: "728fd734-6d25-4b37-9306-50e10e358e13"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.796506 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "728fd734-6d25-4b37-9306-50e10e358e13" (UID: "728fd734-6d25-4b37-9306-50e10e358e13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.809561 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-config" (OuterVolumeSpecName: "config") pod "728fd734-6d25-4b37-9306-50e10e358e13" (UID: "728fd734-6d25-4b37-9306-50e10e358e13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.812647 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "728fd734-6d25-4b37-9306-50e10e358e13" (UID: "728fd734-6d25-4b37-9306-50e10e358e13"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.818247 4965 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.818294 4965 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.818310 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.818324 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj6mb\" (UniqueName: \"kubernetes.io/projected/728fd734-6d25-4b37-9306-50e10e358e13-kube-api-access-gj6mb\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.818343 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.824194 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "728fd734-6d25-4b37-9306-50e10e358e13" (UID: "728fd734-6d25-4b37-9306-50e10e358e13"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.882084 4965 generic.go:334] "Generic (PLEG): container finished" podID="728fd734-6d25-4b37-9306-50e10e358e13" containerID="eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04" exitCode=0 Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.882133 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" event={"ID":"728fd734-6d25-4b37-9306-50e10e358e13","Type":"ContainerDied","Data":"eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04"} Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.882164 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" event={"ID":"728fd734-6d25-4b37-9306-50e10e358e13","Type":"ContainerDied","Data":"e046c28eadf9f0b98a8fce5b47a100f5f8e0d1c345a08571d930fb431bf86e5a"} Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.882184 4965 scope.go:117] "RemoveContainer" containerID="eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.882320 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-r7s4r" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.907571 4965 scope.go:117] "RemoveContainer" containerID="a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.920492 4965 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/728fd734-6d25-4b37-9306-50e10e358e13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.925027 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-r7s4r"] Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.931885 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-r7s4r"] Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.948629 4965 scope.go:117] "RemoveContainer" containerID="eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04" Nov 25 15:31:59 crc kubenswrapper[4965]: E1125 15:31:59.949052 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04\": container with ID starting with eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04 not found: ID does not exist" containerID="eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.949096 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04"} err="failed to get container status \"eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04\": rpc error: code = NotFound desc = could not find container \"eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04\": container with ID starting with 
eb02b2f66959378436ddf671afa7672a43a450d0dd631e106c0fbbc630eb3b04 not found: ID does not exist" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.949125 4965 scope.go:117] "RemoveContainer" containerID="a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12" Nov 25 15:31:59 crc kubenswrapper[4965]: E1125 15:31:59.949528 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12\": container with ID starting with a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12 not found: ID does not exist" containerID="a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12" Nov 25 15:31:59 crc kubenswrapper[4965]: I1125 15:31:59.949543 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12"} err="failed to get container status \"a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12\": rpc error: code = NotFound desc = could not find container \"a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12\": container with ID starting with a2c62e541f48facd471388367ef2119b3a9065dd279f9c6dd2c7a663e3822e12 not found: ID does not exist" Nov 25 15:32:00 crc kubenswrapper[4965]: I1125 15:32:00.782809 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728fd734-6d25-4b37-9306-50e10e358e13" path="/var/lib/kubelet/pods/728fd734-6d25-4b37-9306-50e10e358e13/volumes" Nov 25 15:32:00 crc kubenswrapper[4965]: I1125 15:32:00.892608 4965 generic.go:334] "Generic (PLEG): container finished" podID="d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" containerID="54333ff35d46b21292f53f7d7488e1bb6a4e28568db6acf2ea4fa37ccfdbe903" exitCode=0 Nov 25 15:32:00 crc kubenswrapper[4965]: I1125 15:32:00.892679 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" event={"ID":"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2","Type":"ContainerDied","Data":"54333ff35d46b21292f53f7d7488e1bb6a4e28568db6acf2ea4fa37ccfdbe903"} Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.522397 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.677185 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-inventory\") pod \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.677319 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-ssh-key\") pod \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.677383 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqc85\" (UniqueName: \"kubernetes.io/projected/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-kube-api-access-dqc85\") pod \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.677593 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-repo-setup-combined-ca-bundle\") pod \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\" (UID: \"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2\") " Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.687030 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" (UID: "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.687115 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-kube-api-access-dqc85" (OuterVolumeSpecName: "kube-api-access-dqc85") pod "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" (UID: "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2"). InnerVolumeSpecName "kube-api-access-dqc85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.708231 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" (UID: "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.712035 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-inventory" (OuterVolumeSpecName: "inventory") pod "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" (UID: "d808d3ad-65f1-4019-a1d1-5d0b9afac8c2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.779343 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.779381 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqc85\" (UniqueName: \"kubernetes.io/projected/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-kube-api-access-dqc85\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.779400 4965 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.779409 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d808d3ad-65f1-4019-a1d1-5d0b9afac8c2-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.911155 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" event={"ID":"d808d3ad-65f1-4019-a1d1-5d0b9afac8c2","Type":"ContainerDied","Data":"81f81f777b793a43c1711655c9d05c6375db0ad6526bba5f87532753025e1618"} Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.911198 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f81f777b793a43c1711655c9d05c6375db0ad6526bba5f87532753025e1618" Nov 25 15:32:02 crc kubenswrapper[4965]: I1125 15:32:02.911258 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.092740 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5"] Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093310 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="extract-utilities" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093326 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="extract-utilities" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093342 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="extract-content" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093349 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="extract-content" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093362 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728fd734-6d25-4b37-9306-50e10e358e13" containerName="init" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093370 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="728fd734-6d25-4b37-9306-50e10e358e13" containerName="init" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093384 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728fd734-6d25-4b37-9306-50e10e358e13" containerName="dnsmasq-dns" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093389 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="728fd734-6d25-4b37-9306-50e10e358e13" containerName="dnsmasq-dns" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093408 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="dnsmasq-dns" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093414 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="dnsmasq-dns" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093427 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="init" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093433 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="init" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093444 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="registry-server" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093450 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="registry-server" Nov 25 15:32:03 crc kubenswrapper[4965]: E1125 15:32:03.093461 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093469 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093643 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="d808d3ad-65f1-4019-a1d1-5d0b9afac8c2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093659 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2de4912-4242-4b1e-acb5-d5e165f881ce" containerName="registry-server" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093672 4965 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9e098e76-ffb9-40ad-9312-96f70fc8d0d7" containerName="dnsmasq-dns" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.093684 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="728fd734-6d25-4b37-9306-50e10e358e13" containerName="dnsmasq-dns" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.094256 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.099416 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.099581 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.099654 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.099668 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.127714 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5"] Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.195929 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8d9x\" (UniqueName: \"kubernetes.io/projected/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-kube-api-access-h8d9x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.196004 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.196029 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.196114 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.297421 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8d9x\" (UniqueName: \"kubernetes.io/projected/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-kube-api-access-h8d9x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.297485 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.297520 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.297608 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.302794 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.303532 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.314241 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.316649 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8d9x\" (UniqueName: \"kubernetes.io/projected/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-kube-api-access-h8d9x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:03 crc kubenswrapper[4965]: I1125 15:32:03.414404 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:32:04 crc kubenswrapper[4965]: I1125 15:32:03.997834 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5"] Nov 25 15:32:04 crc kubenswrapper[4965]: I1125 15:32:04.944691 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" event={"ID":"fbce69a8-d42e-498d-bbb8-7d98e9b1790e","Type":"ContainerStarted","Data":"853580402a00d5ea539736e380549b5f1b3b6fecb9e70e2d2ae0c41835a9353b"} Nov 25 15:32:05 crc kubenswrapper[4965]: I1125 15:32:05.426498 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:32:05 crc kubenswrapper[4965]: I1125 15:32:05.483459 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qm2ks" Nov 25 15:32:05 crc kubenswrapper[4965]: I1125 15:32:05.578208 4965 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/community-operators-qm2ks"] Nov 25 15:32:05 crc kubenswrapper[4965]: I1125 15:32:05.677634 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6n7bt"] Nov 25 15:32:05 crc kubenswrapper[4965]: I1125 15:32:05.677924 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6n7bt" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="registry-server" containerID="cri-o://cc76d35d55b940b2deb1af0e8a8c6a9d2a2eead59cef8e294e43e5290bee8048" gracePeriod=2 Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:05.954269 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" event={"ID":"fbce69a8-d42e-498d-bbb8-7d98e9b1790e","Type":"ContainerStarted","Data":"320df03bf90a91135cbf1f9a0eb5f7752d784b53daa481eec255d0ac44690182"} Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:05.959477 4965 generic.go:334] "Generic (PLEG): container finished" podID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerID="cc76d35d55b940b2deb1af0e8a8c6a9d2a2eead59cef8e294e43e5290bee8048" exitCode=0 Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:05.959552 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7bt" event={"ID":"42408875-e2d8-4537-85c8-aa2f8fe58cc0","Type":"ContainerDied","Data":"cc76d35d55b940b2deb1af0e8a8c6a9d2a2eead59cef8e294e43e5290bee8048"} Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:05.980306 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" podStartSLOduration=1.857707727 podStartE2EDuration="2.980286877s" podCreationTimestamp="2025-11-25 15:32:03 +0000 UTC" firstStartedPulling="2025-11-25 15:32:03.990151356 +0000 UTC m=+1668.957745102" lastFinishedPulling="2025-11-25 15:32:05.112730506 +0000 UTC 
m=+1670.080324252" observedRunningTime="2025-11-25 15:32:05.968573787 +0000 UTC m=+1670.936167533" watchObservedRunningTime="2025-11-25 15:32:05.980286877 +0000 UTC m=+1670.947880623" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.256174 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n7bt" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.366241 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-utilities\") pod \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.366336 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-catalog-content\") pod \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.366365 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlvtq\" (UniqueName: \"kubernetes.io/projected/42408875-e2d8-4537-85c8-aa2f8fe58cc0-kube-api-access-dlvtq\") pod \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\" (UID: \"42408875-e2d8-4537-85c8-aa2f8fe58cc0\") " Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.366982 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-utilities" (OuterVolumeSpecName: "utilities") pod "42408875-e2d8-4537-85c8-aa2f8fe58cc0" (UID: "42408875-e2d8-4537-85c8-aa2f8fe58cc0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.373176 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42408875-e2d8-4537-85c8-aa2f8fe58cc0-kube-api-access-dlvtq" (OuterVolumeSpecName: "kube-api-access-dlvtq") pod "42408875-e2d8-4537-85c8-aa2f8fe58cc0" (UID: "42408875-e2d8-4537-85c8-aa2f8fe58cc0"). InnerVolumeSpecName "kube-api-access-dlvtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.405738 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42408875-e2d8-4537-85c8-aa2f8fe58cc0" (UID: "42408875-e2d8-4537-85c8-aa2f8fe58cc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.468514 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.468546 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42408875-e2d8-4537-85c8-aa2f8fe58cc0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.468560 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlvtq\" (UniqueName: \"kubernetes.io/projected/42408875-e2d8-4537-85c8-aa2f8fe58cc0-kube-api-access-dlvtq\") on node \"crc\" DevicePath \"\"" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.972111 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7bt" 
event={"ID":"42408875-e2d8-4537-85c8-aa2f8fe58cc0","Type":"ContainerDied","Data":"930ced5038582aded1bb4660fc30f4c2a1692af13300cd441fe36a627838bc95"} Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.972182 4965 scope.go:117] "RemoveContainer" containerID="cc76d35d55b940b2deb1af0e8a8c6a9d2a2eead59cef8e294e43e5290bee8048" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:06.972515 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n7bt" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:07.009124 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6n7bt"] Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:07.024322 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6n7bt"] Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:07.034502 4965 scope.go:117] "RemoveContainer" containerID="c29131e8d88f0cb759c0f5602dcf3106a9c78a003e8bd3f1967c291992e314bf" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:07.221366 4965 scope.go:117] "RemoveContainer" containerID="17e7f2030e6d2bbe23edc5e3740712906e6749b8e21641d1727e480e5d94b5fd" Nov 25 15:32:08 crc kubenswrapper[4965]: I1125 15:32:08.786116 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" path="/var/lib/kubelet/pods/42408875-e2d8-4537-85c8-aa2f8fe58cc0/volumes" Nov 25 15:32:09 crc kubenswrapper[4965]: I1125 15:32:09.677730 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d5400ed8-9880-47b3-b8e7-5de35a2c7e00" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.192:5671: connect: connection refused" Nov 25 15:32:10 crc kubenswrapper[4965]: I1125 15:32:10.670262 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 25 15:32:12 crc 
kubenswrapper[4965]: I1125 15:32:12.772548 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:32:12 crc kubenswrapper[4965]: E1125 15:32:12.773247 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:32:19 crc kubenswrapper[4965]: I1125 15:32:19.676922 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 25 15:32:24 crc kubenswrapper[4965]: I1125 15:32:24.771942 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:32:24 crc kubenswrapper[4965]: E1125 15:32:24.772793 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:32:33 crc kubenswrapper[4965]: I1125 15:32:33.286142 4965 scope.go:117] "RemoveContainer" containerID="5d94320ba1d71d2e918a3a347c3890dba983d183d197427156eb60aa5b159774" Nov 25 15:32:39 crc kubenswrapper[4965]: I1125 15:32:39.771413 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:32:39 crc kubenswrapper[4965]: E1125 15:32:39.772454 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:32:51 crc kubenswrapper[4965]: I1125 15:32:51.771218 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:32:51 crc kubenswrapper[4965]: E1125 15:32:51.771946 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:33:02 crc kubenswrapper[4965]: I1125 15:33:02.775648 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:33:02 crc kubenswrapper[4965]: E1125 15:33:02.777389 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:33:15 crc kubenswrapper[4965]: I1125 15:33:15.771269 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:33:15 crc kubenswrapper[4965]: E1125 15:33:15.771874 4965 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:33:26 crc kubenswrapper[4965]: I1125 15:33:26.777228 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:33:26 crc kubenswrapper[4965]: E1125 15:33:26.777916 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:33:40 crc kubenswrapper[4965]: I1125 15:33:40.772897 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:33:40 crc kubenswrapper[4965]: E1125 15:33:40.773757 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:33:51 crc kubenswrapper[4965]: I1125 15:33:51.771913 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:33:51 crc kubenswrapper[4965]: E1125 15:33:51.773716 4965 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:34:03 crc kubenswrapper[4965]: I1125 15:34:03.771560 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:34:03 crc kubenswrapper[4965]: E1125 15:34:03.772440 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:34:18 crc kubenswrapper[4965]: I1125 15:34:18.773756 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:34:18 crc kubenswrapper[4965]: E1125 15:34:18.774761 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:34:30 crc kubenswrapper[4965]: I1125 15:34:30.072923 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:34:30 crc kubenswrapper[4965]: E1125 15:34:30.073796 4965 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:34:44 crc kubenswrapper[4965]: I1125 15:34:44.771670 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:34:44 crc kubenswrapper[4965]: E1125 15:34:44.772310 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:34:57 crc kubenswrapper[4965]: I1125 15:34:57.771645 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:34:57 crc kubenswrapper[4965]: E1125 15:34:57.772400 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:35:10 crc kubenswrapper[4965]: I1125 15:35:10.771888 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:35:10 crc kubenswrapper[4965]: E1125 
15:35:10.772788 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:35:21 crc kubenswrapper[4965]: I1125 15:35:21.774067 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:35:21 crc kubenswrapper[4965]: E1125 15:35:21.774748 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:35:24 crc kubenswrapper[4965]: I1125 15:35:24.107615 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-krr6h"] Nov 25 15:35:24 crc kubenswrapper[4965]: I1125 15:35:24.115539 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-krr6h"] Nov 25 15:35:24 crc kubenswrapper[4965]: I1125 15:35:24.787288 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="656770af-f30e-490a-9987-71cc29c5e278" path="/var/lib/kubelet/pods/656770af-f30e-490a-9987-71cc29c5e278/volumes" Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.043204 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a38e-account-create-jh8kh"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.055416 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-k2wml"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.067431 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-f6bkw"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.078260 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ae27-account-create-2ng9k"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.087711 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-k2wml"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.096703 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ae27-account-create-2ng9k"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.106292 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a38e-account-create-jh8kh"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.115240 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-f6bkw"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.123203 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-17b8-account-create-f8lf2"] Nov 25 15:35:25 crc kubenswrapper[4965]: I1125 15:35:25.130258 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-17b8-account-create-f8lf2"] Nov 25 15:35:26 crc kubenswrapper[4965]: I1125 15:35:26.786375 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ff1c96-a8a8-431e-ab9b-a76d3992463f" path="/var/lib/kubelet/pods/59ff1c96-a8a8-431e-ab9b-a76d3992463f/volumes" Nov 25 15:35:26 crc kubenswrapper[4965]: I1125 15:35:26.787788 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61094cee-413d-4b91-ace8-69afdbaa6226" path="/var/lib/kubelet/pods/61094cee-413d-4b91-ace8-69afdbaa6226/volumes" Nov 25 15:35:26 crc kubenswrapper[4965]: I1125 15:35:26.788610 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6c898509-daeb-4566-b5b1-39f4b742b5d0" path="/var/lib/kubelet/pods/6c898509-daeb-4566-b5b1-39f4b742b5d0/volumes" Nov 25 15:35:26 crc kubenswrapper[4965]: I1125 15:35:26.789435 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93df8a0d-1ef0-4598-8253-92c5bbeb95aa" path="/var/lib/kubelet/pods/93df8a0d-1ef0-4598-8253-92c5bbeb95aa/volumes" Nov 25 15:35:26 crc kubenswrapper[4965]: I1125 15:35:26.790994 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2155fa2-3b94-4413-84de-171185b0d253" path="/var/lib/kubelet/pods/b2155fa2-3b94-4413-84de-171185b0d253/volumes" Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.059101 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4468-account-create-vm2dh"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.079051 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7113-account-create-l65nk"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.091096 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jlj9w"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.100809 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-p7fhq"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.109677 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4468-account-create-vm2dh"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.117668 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jlj9w"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.125008 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7113-account-create-l65nk"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.132480 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-p7fhq"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.139182 4965 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-s5wpj"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.146670 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-69c1-account-create-z47b6"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.154427 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-s5wpj"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.161317 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-69c1-account-create-z47b6"] Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.768098 4965 generic.go:334] "Generic (PLEG): container finished" podID="fbce69a8-d42e-498d-bbb8-7d98e9b1790e" containerID="320df03bf90a91135cbf1f9a0eb5f7752d784b53daa481eec255d0ac44690182" exitCode=0 Nov 25 15:35:27 crc kubenswrapper[4965]: I1125 15:35:27.768171 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" event={"ID":"fbce69a8-d42e-498d-bbb8-7d98e9b1790e","Type":"ContainerDied","Data":"320df03bf90a91135cbf1f9a0eb5f7752d784b53daa481eec255d0ac44690182"} Nov 25 15:35:28 crc kubenswrapper[4965]: I1125 15:35:28.790595 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6721ea95-4d66-4dc4-b502-a4e6be931279" path="/var/lib/kubelet/pods/6721ea95-4d66-4dc4-b502-a4e6be931279/volumes" Nov 25 15:35:28 crc kubenswrapper[4965]: I1125 15:35:28.791442 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ed68a6-1a16-4e43-86a2-7901c1c7aa35" path="/var/lib/kubelet/pods/78ed68a6-1a16-4e43-86a2-7901c1c7aa35/volumes" Nov 25 15:35:28 crc kubenswrapper[4965]: I1125 15:35:28.792514 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57d631d-ea34-407d-be71-02c7bf7bc2e4" path="/var/lib/kubelet/pods/c57d631d-ea34-407d-be71-02c7bf7bc2e4/volumes" Nov 25 15:35:28 crc kubenswrapper[4965]: I1125 15:35:28.793111 
4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8db14bf-f245-4b59-a601-b9319b4b2e23" path="/var/lib/kubelet/pods/c8db14bf-f245-4b59-a601-b9319b4b2e23/volumes" Nov 25 15:35:28 crc kubenswrapper[4965]: I1125 15:35:28.795767 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9637eb-b283-42b2-9089-b0bca2df1b8a" path="/var/lib/kubelet/pods/ea9637eb-b283-42b2-9089-b0bca2df1b8a/volumes" Nov 25 15:35:28 crc kubenswrapper[4965]: I1125 15:35:28.796339 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1d2b24-a9a4-4fde-9a7d-6875a537887f" path="/var/lib/kubelet/pods/fb1d2b24-a9a4-4fde-9a7d-6875a537887f/volumes" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.200729 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.272565 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8d9x\" (UniqueName: \"kubernetes.io/projected/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-kube-api-access-h8d9x\") pod \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.272720 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-bootstrap-combined-ca-bundle\") pod \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.272795 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-inventory\") pod \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " Nov 25 
15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.272868 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-ssh-key\") pod \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\" (UID: \"fbce69a8-d42e-498d-bbb8-7d98e9b1790e\") " Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.277995 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-kube-api-access-h8d9x" (OuterVolumeSpecName: "kube-api-access-h8d9x") pod "fbce69a8-d42e-498d-bbb8-7d98e9b1790e" (UID: "fbce69a8-d42e-498d-bbb8-7d98e9b1790e"). InnerVolumeSpecName "kube-api-access-h8d9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.283105 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fbce69a8-d42e-498d-bbb8-7d98e9b1790e" (UID: "fbce69a8-d42e-498d-bbb8-7d98e9b1790e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.299905 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-inventory" (OuterVolumeSpecName: "inventory") pod "fbce69a8-d42e-498d-bbb8-7d98e9b1790e" (UID: "fbce69a8-d42e-498d-bbb8-7d98e9b1790e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.302189 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fbce69a8-d42e-498d-bbb8-7d98e9b1790e" (UID: "fbce69a8-d42e-498d-bbb8-7d98e9b1790e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.375298 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8d9x\" (UniqueName: \"kubernetes.io/projected/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-kube-api-access-h8d9x\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.375338 4965 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.375352 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.375362 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbce69a8-d42e-498d-bbb8-7d98e9b1790e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.793322 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" event={"ID":"fbce69a8-d42e-498d-bbb8-7d98e9b1790e","Type":"ContainerDied","Data":"853580402a00d5ea539736e380549b5f1b3b6fecb9e70e2d2ae0c41835a9353b"} Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.794055 4965 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="853580402a00d5ea539736e380549b5f1b3b6fecb9e70e2d2ae0c41835a9353b" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.793422 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.894076 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn"] Nov 25 15:35:29 crc kubenswrapper[4965]: E1125 15:35:29.894430 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="registry-server" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.894447 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="registry-server" Nov 25 15:35:29 crc kubenswrapper[4965]: E1125 15:35:29.894457 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="extract-utilities" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.894463 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="extract-utilities" Nov 25 15:35:29 crc kubenswrapper[4965]: E1125 15:35:29.894484 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbce69a8-d42e-498d-bbb8-7d98e9b1790e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.894492 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbce69a8-d42e-498d-bbb8-7d98e9b1790e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 15:35:29 crc kubenswrapper[4965]: E1125 15:35:29.894501 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="extract-content" Nov 25 15:35:29 crc 
kubenswrapper[4965]: I1125 15:35:29.894507 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="extract-content" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.894670 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbce69a8-d42e-498d-bbb8-7d98e9b1790e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.894696 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="42408875-e2d8-4537-85c8-aa2f8fe58cc0" containerName="registry-server" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.895365 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.897801 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.899161 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.899202 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.905852 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn"] Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.907706 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.989123 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5m7\" (UniqueName: 
\"kubernetes.io/projected/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-kube-api-access-mc5m7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.989225 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:29 crc kubenswrapper[4965]: I1125 15:35:29.989263 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.090738 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5m7\" (UniqueName: \"kubernetes.io/projected/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-kube-api-access-mc5m7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.091160 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: 
\"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.091197 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.095678 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.096104 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.110164 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5m7\" (UniqueName: \"kubernetes.io/projected/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-kube-api-access-mc5m7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.290562 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.795946 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn"] Nov 25 15:35:30 crc kubenswrapper[4965]: I1125 15:35:30.810471 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:35:31 crc kubenswrapper[4965]: I1125 15:35:31.820703 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" event={"ID":"4e699efe-dd84-4db4-aa1c-a47a22e55f5f","Type":"ContainerStarted","Data":"560b80171880bb832c7ff791f13e89106e7a5162f659f5b218ad50f79c000994"} Nov 25 15:35:31 crc kubenswrapper[4965]: I1125 15:35:31.822204 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" event={"ID":"4e699efe-dd84-4db4-aa1c-a47a22e55f5f","Type":"ContainerStarted","Data":"49168fbb024fd93f39e9c7e121689501492295dd4af19db01ee5bb7314725668"} Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.459598 4965 scope.go:117] "RemoveContainer" containerID="80cefc3a27566e1dd1c5c312795fd602136a0cc9094b1586d81dd01cb2d77937" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.490471 4965 scope.go:117] "RemoveContainer" containerID="0d489402ee0d82ef91bca3b81442c53d1028892727879e49e5b32a9c9f7e450d" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.541304 4965 scope.go:117] "RemoveContainer" containerID="c3a5b946b44a637dab4555a17b2766a9c760ec14be5fd4395a0a2479cd6dafaf" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.571487 4965 scope.go:117] "RemoveContainer" containerID="d6b6591730a09518e037ffc330a8fc32d7d615522fbfebf00dba44422303a8fb" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.612791 4965 scope.go:117] "RemoveContainer" 
containerID="446a0de38926fe665a87a5e88a9915badb47641538bd91c541808557d625bb7b" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.664259 4965 scope.go:117] "RemoveContainer" containerID="5cda8458a5634f0c2eb16b9ac703fa498b930b59e5454f2d8ea7320a6deb7dd6" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.692024 4965 scope.go:117] "RemoveContainer" containerID="5bdb486b59ee958aed2ea02dd64e3833a4ae9389982b8559b3f55aac39793306" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.728332 4965 scope.go:117] "RemoveContainer" containerID="1030d7dc06e8080135fd865c891265f319d7a05846ef1304b0e7a6b0668b8f96" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.746623 4965 scope.go:117] "RemoveContainer" containerID="8caadc073e744616463e2f51ef22e4fc57e6b6067ab93eee7ddcb6baaf14875c" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.766608 4965 scope.go:117] "RemoveContainer" containerID="743f5cfd5ad15220384b64b72b0920716e12c76ca301428ad251c152329d78dd" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.794054 4965 scope.go:117] "RemoveContainer" containerID="9fdea94f82c0c0d97362c83c9514d5dbeed0dd6311105b5d146c7d2da5c965a6" Nov 25 15:35:33 crc kubenswrapper[4965]: I1125 15:35:33.816757 4965 scope.go:117] "RemoveContainer" containerID="9baa1a982883b23ace0afeafe8e26054c0e2072ed06492309cd844484d0f5401" Nov 25 15:35:36 crc kubenswrapper[4965]: I1125 15:35:36.777620 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:35:37 crc kubenswrapper[4965]: I1125 15:35:37.902214 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"a78849570ebdec0ecb7f8223886210e662c725014f30c20e18e315182e077c80"} Nov 25 15:35:37 crc kubenswrapper[4965]: I1125 15:35:37.930324 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" podStartSLOduration=8.487984302 podStartE2EDuration="8.930297969s" podCreationTimestamp="2025-11-25 15:35:29 +0000 UTC" firstStartedPulling="2025-11-25 15:35:30.810244819 +0000 UTC m=+1875.777838565" lastFinishedPulling="2025-11-25 15:35:31.252558486 +0000 UTC m=+1876.220152232" observedRunningTime="2025-11-25 15:35:31.839209238 +0000 UTC m=+1876.806802994" watchObservedRunningTime="2025-11-25 15:35:37.930297969 +0000 UTC m=+1882.897891725" Nov 25 15:35:54 crc kubenswrapper[4965]: I1125 15:35:54.045333 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qx2h5"] Nov 25 15:35:54 crc kubenswrapper[4965]: I1125 15:35:54.064452 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qx2h5"] Nov 25 15:35:54 crc kubenswrapper[4965]: I1125 15:35:54.791754 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc256e5-efae-4f80-bcb9-c196c1223ac0" path="/var/lib/kubelet/pods/2fc256e5-efae-4f80-bcb9-c196c1223ac0/volumes" Nov 25 15:36:32 crc kubenswrapper[4965]: I1125 15:36:32.058723 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wbvvf"] Nov 25 15:36:32 crc kubenswrapper[4965]: I1125 15:36:32.074064 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wbvvf"] Nov 25 15:36:32 crc kubenswrapper[4965]: I1125 15:36:32.789791 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d6c94f-3d83-4521-ac18-3eda81279450" path="/var/lib/kubelet/pods/c9d6c94f-3d83-4521-ac18-3eda81279450/volumes" Nov 25 15:36:34 crc kubenswrapper[4965]: I1125 15:36:34.049207 4965 scope.go:117] "RemoveContainer" containerID="edcb1442d96aa44a6029eaa6061be74690fb6b55eed1580484bb644cb9bdbe0a" Nov 25 15:36:34 crc kubenswrapper[4965]: I1125 15:36:34.088334 4965 scope.go:117] "RemoveContainer" 
containerID="d7a5f784131138dab7cfff054577351056b9ac1c10c088ec78d4673b33c01d4a" Nov 25 15:36:34 crc kubenswrapper[4965]: I1125 15:36:34.125255 4965 scope.go:117] "RemoveContainer" containerID="f75200b06789685d958651e45ef85806bce17e59eb65311be59df2487b8777b7" Nov 25 15:36:34 crc kubenswrapper[4965]: I1125 15:36:34.159370 4965 scope.go:117] "RemoveContainer" containerID="a8437bc5dd4abecb7bed19b9f36ecad826757955e7637392785a475e2dfdeeb7" Nov 25 15:36:45 crc kubenswrapper[4965]: I1125 15:36:45.063834 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gpzgb"] Nov 25 15:36:45 crc kubenswrapper[4965]: I1125 15:36:45.076025 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gpzgb"] Nov 25 15:36:46 crc kubenswrapper[4965]: I1125 15:36:46.035626 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rg5sn"] Nov 25 15:36:46 crc kubenswrapper[4965]: I1125 15:36:46.043498 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rg5sn"] Nov 25 15:36:46 crc kubenswrapper[4965]: I1125 15:36:46.789278 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00430c42-5b8c-45e7-97d3-c4c256468678" path="/var/lib/kubelet/pods/00430c42-5b8c-45e7-97d3-c4c256468678/volumes" Nov 25 15:36:46 crc kubenswrapper[4965]: I1125 15:36:46.790104 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff456420-44c5-4945-925c-2f36ae44aad3" path="/var/lib/kubelet/pods/ff456420-44c5-4945-925c-2f36ae44aad3/volumes" Nov 25 15:36:48 crc kubenswrapper[4965]: I1125 15:36:48.028861 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sj59d"] Nov 25 15:36:48 crc kubenswrapper[4965]: I1125 15:36:48.035948 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sj59d"] Nov 25 15:36:48 crc kubenswrapper[4965]: I1125 15:36:48.785387 4965 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="3aac4c8e-9a2b-4268-b9c9-c2920b585f64" path="/var/lib/kubelet/pods/3aac4c8e-9a2b-4268-b9c9-c2920b585f64/volumes" Nov 25 15:36:52 crc kubenswrapper[4965]: I1125 15:36:52.620505 4965 generic.go:334] "Generic (PLEG): container finished" podID="4e699efe-dd84-4db4-aa1c-a47a22e55f5f" containerID="560b80171880bb832c7ff791f13e89106e7a5162f659f5b218ad50f79c000994" exitCode=0 Nov 25 15:36:52 crc kubenswrapper[4965]: I1125 15:36:52.620582 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" event={"ID":"4e699efe-dd84-4db4-aa1c-a47a22e55f5f","Type":"ContainerDied","Data":"560b80171880bb832c7ff791f13e89106e7a5162f659f5b218ad50f79c000994"} Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.125061 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.194092 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5m7\" (UniqueName: \"kubernetes.io/projected/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-kube-api-access-mc5m7\") pod \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.194133 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-ssh-key\") pod \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\" (UID: \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.194262 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-inventory\") pod \"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\" (UID: 
\"4e699efe-dd84-4db4-aa1c-a47a22e55f5f\") " Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.199783 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-kube-api-access-mc5m7" (OuterVolumeSpecName: "kube-api-access-mc5m7") pod "4e699efe-dd84-4db4-aa1c-a47a22e55f5f" (UID: "4e699efe-dd84-4db4-aa1c-a47a22e55f5f"). InnerVolumeSpecName "kube-api-access-mc5m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.224255 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-inventory" (OuterVolumeSpecName: "inventory") pod "4e699efe-dd84-4db4-aa1c-a47a22e55f5f" (UID: "4e699efe-dd84-4db4-aa1c-a47a22e55f5f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.224645 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e699efe-dd84-4db4-aa1c-a47a22e55f5f" (UID: "4e699efe-dd84-4db4-aa1c-a47a22e55f5f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.297074 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5m7\" (UniqueName: \"kubernetes.io/projected/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-kube-api-access-mc5m7\") on node \"crc\" DevicePath \"\"" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.297631 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.298222 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e699efe-dd84-4db4-aa1c-a47a22e55f5f-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.641179 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" event={"ID":"4e699efe-dd84-4db4-aa1c-a47a22e55f5f","Type":"ContainerDied","Data":"49168fbb024fd93f39e9c7e121689501492295dd4af19db01ee5bb7314725668"} Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.641241 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49168fbb024fd93f39e9c7e121689501492295dd4af19db01ee5bb7314725668" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.641320 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.746952 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw"] Nov 25 15:36:54 crc kubenswrapper[4965]: E1125 15:36:54.747467 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e699efe-dd84-4db4-aa1c-a47a22e55f5f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.747503 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e699efe-dd84-4db4-aa1c-a47a22e55f5f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.747745 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e699efe-dd84-4db4-aa1c-a47a22e55f5f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.748497 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.751192 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.755175 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.755614 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.755850 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.765858 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw"] Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.807217 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.807331 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.807658 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtbh\" (UniqueName: \"kubernetes.io/projected/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-kube-api-access-tdtbh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.909340 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.909444 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.909524 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtbh\" (UniqueName: \"kubernetes.io/projected/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-kube-api-access-tdtbh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.915345 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.923683 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:54 crc kubenswrapper[4965]: I1125 15:36:54.928990 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtbh\" (UniqueName: \"kubernetes.io/projected/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-kube-api-access-tdtbh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkffw\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:55 crc kubenswrapper[4965]: I1125 15:36:55.084710 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:36:55 crc kubenswrapper[4965]: I1125 15:36:55.662680 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw"] Nov 25 15:36:56 crc kubenswrapper[4965]: I1125 15:36:56.661816 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" event={"ID":"64f9c1f6-1dd0-4259-962c-637af6b3cf0f","Type":"ContainerStarted","Data":"1648fd33cbc2005488a768b65bab7ed76cb75855f41caaebf212cfb8a51676f2"} Nov 25 15:36:56 crc kubenswrapper[4965]: I1125 15:36:56.661944 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" event={"ID":"64f9c1f6-1dd0-4259-962c-637af6b3cf0f","Type":"ContainerStarted","Data":"7d918170c14cf3d45e6c908fe7284b5c41ad7737bb1c122979bdcc3e255ea83d"} Nov 25 15:36:56 crc kubenswrapper[4965]: I1125 15:36:56.688580 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" podStartSLOduration=2.286453617 podStartE2EDuration="2.688558361s" podCreationTimestamp="2025-11-25 15:36:54 +0000 UTC" firstStartedPulling="2025-11-25 15:36:55.66766886 +0000 UTC m=+1960.635262606" lastFinishedPulling="2025-11-25 15:36:56.069773594 +0000 UTC m=+1961.037367350" observedRunningTime="2025-11-25 15:36:56.678416934 +0000 UTC m=+1961.646010680" watchObservedRunningTime="2025-11-25 15:36:56.688558361 +0000 UTC m=+1961.656152107" Nov 25 15:37:01 crc kubenswrapper[4965]: I1125 15:37:01.711026 4965 generic.go:334] "Generic (PLEG): container finished" podID="64f9c1f6-1dd0-4259-962c-637af6b3cf0f" containerID="1648fd33cbc2005488a768b65bab7ed76cb75855f41caaebf212cfb8a51676f2" exitCode=0 Nov 25 15:37:01 crc kubenswrapper[4965]: I1125 15:37:01.711195 4965 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" event={"ID":"64f9c1f6-1dd0-4259-962c-637af6b3cf0f","Type":"ContainerDied","Data":"1648fd33cbc2005488a768b65bab7ed76cb75855f41caaebf212cfb8a51676f2"} Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.144547 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.312756 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-ssh-key\") pod \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.312851 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdtbh\" (UniqueName: \"kubernetes.io/projected/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-kube-api-access-tdtbh\") pod \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.312883 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-inventory\") pod \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\" (UID: \"64f9c1f6-1dd0-4259-962c-637af6b3cf0f\") " Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.332227 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-kube-api-access-tdtbh" (OuterVolumeSpecName: "kube-api-access-tdtbh") pod "64f9c1f6-1dd0-4259-962c-637af6b3cf0f" (UID: "64f9c1f6-1dd0-4259-962c-637af6b3cf0f"). InnerVolumeSpecName "kube-api-access-tdtbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.337715 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "64f9c1f6-1dd0-4259-962c-637af6b3cf0f" (UID: "64f9c1f6-1dd0-4259-962c-637af6b3cf0f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.338132 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-inventory" (OuterVolumeSpecName: "inventory") pod "64f9c1f6-1dd0-4259-962c-637af6b3cf0f" (UID: "64f9c1f6-1dd0-4259-962c-637af6b3cf0f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.415318 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.415350 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdtbh\" (UniqueName: \"kubernetes.io/projected/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-kube-api-access-tdtbh\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.415364 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64f9c1f6-1dd0-4259-962c-637af6b3cf0f-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.733400 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" 
event={"ID":"64f9c1f6-1dd0-4259-962c-637af6b3cf0f","Type":"ContainerDied","Data":"7d918170c14cf3d45e6c908fe7284b5c41ad7737bb1c122979bdcc3e255ea83d"} Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.733434 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d918170c14cf3d45e6c908fe7284b5c41ad7737bb1c122979bdcc3e255ea83d" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.733846 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkffw" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.822857 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x"] Nov 25 15:37:03 crc kubenswrapper[4965]: E1125 15:37:03.823527 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f9c1f6-1dd0-4259-962c-637af6b3cf0f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.823627 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f9c1f6-1dd0-4259-962c-637af6b3cf0f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.823888 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f9c1f6-1dd0-4259-962c-637af6b3cf0f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.824487 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.826274 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.826474 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.834887 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.835057 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.838453 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x"] Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.922012 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hpnb\" (UniqueName: \"kubernetes.io/projected/71388e9b-27dd-4af7-aa67-4330d051a98d-kube-api-access-5hpnb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.922175 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:03 crc kubenswrapper[4965]: I1125 15:37:03.922296 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.024104 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.024986 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hpnb\" (UniqueName: \"kubernetes.io/projected/71388e9b-27dd-4af7-aa67-4330d051a98d-kube-api-access-5hpnb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.025201 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.029245 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: 
\"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.030189 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.040447 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hpnb\" (UniqueName: \"kubernetes.io/projected/71388e9b-27dd-4af7-aa67-4330d051a98d-kube-api-access-5hpnb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-trs8x\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.154620 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.499102 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x"] Nov 25 15:37:04 crc kubenswrapper[4965]: I1125 15:37:04.741547 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" event={"ID":"71388e9b-27dd-4af7-aa67-4330d051a98d","Type":"ContainerStarted","Data":"3b77c4b535e3948ae1ea3ab6173a7fc491e4f60d8a6beec8721d17f0c4339e04"} Nov 25 15:37:05 crc kubenswrapper[4965]: I1125 15:37:05.750558 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" event={"ID":"71388e9b-27dd-4af7-aa67-4330d051a98d","Type":"ContainerStarted","Data":"bf0af57838796b7335e5505c6c7d5f94b36e5f7932524db8a91753bb89d9918b"} Nov 25 15:37:05 crc kubenswrapper[4965]: I1125 15:37:05.768873 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" podStartSLOduration=1.9551837380000001 podStartE2EDuration="2.768853164s" podCreationTimestamp="2025-11-25 15:37:03 +0000 UTC" firstStartedPulling="2025-11-25 15:37:04.503710772 +0000 UTC m=+1969.471304508" lastFinishedPulling="2025-11-25 15:37:05.317380188 +0000 UTC m=+1970.284973934" observedRunningTime="2025-11-25 15:37:05.763364365 +0000 UTC m=+1970.730958111" watchObservedRunningTime="2025-11-25 15:37:05.768853164 +0000 UTC m=+1970.736446920" Nov 25 15:37:16 crc kubenswrapper[4965]: I1125 15:37:16.042107 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lmtqd"] Nov 25 15:37:16 crc kubenswrapper[4965]: I1125 15:37:16.049267 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lmtqd"] Nov 25 15:37:16 crc kubenswrapper[4965]: I1125 15:37:16.785187 4965 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78768f1b-d9d5-4124-8fe5-bc4b357605ca" path="/var/lib/kubelet/pods/78768f1b-d9d5-4124-8fe5-bc4b357605ca/volumes" Nov 25 15:37:24 crc kubenswrapper[4965]: I1125 15:37:24.039568 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-cr787"] Nov 25 15:37:24 crc kubenswrapper[4965]: I1125 15:37:24.054328 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-cr787"] Nov 25 15:37:24 crc kubenswrapper[4965]: I1125 15:37:24.784588 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0546399d-f1ee-4fe8-aa16-fb64e9f58899" path="/var/lib/kubelet/pods/0546399d-f1ee-4fe8-aa16-fb64e9f58899/volumes" Nov 25 15:37:34 crc kubenswrapper[4965]: I1125 15:37:34.257505 4965 scope.go:117] "RemoveContainer" containerID="e53e124612fc63f48237b9a3c25ef382c79922eea0f8f23723f10bfddfd2698d" Nov 25 15:37:34 crc kubenswrapper[4965]: I1125 15:37:34.302474 4965 scope.go:117] "RemoveContainer" containerID="a1966663a4a8faa990a6cfe8007baafbbf17a1beca1be322056816ffbc5a8efd" Nov 25 15:37:34 crc kubenswrapper[4965]: I1125 15:37:34.401173 4965 scope.go:117] "RemoveContainer" containerID="a7c8975209991a5252c93faab063aef4580adffd72d12bc295805db554186c6d" Nov 25 15:37:34 crc kubenswrapper[4965]: I1125 15:37:34.440450 4965 scope.go:117] "RemoveContainer" containerID="7d10150452146b84ff4b8b901f36e5add7cc84ac487cefe12062b670af70c4ef" Nov 25 15:37:34 crc kubenswrapper[4965]: I1125 15:37:34.485315 4965 scope.go:117] "RemoveContainer" containerID="b8e3551fd346e2a391d92ef38d4799350aa5efcc2cd58d2fc6285cef38ebf8e0" Nov 25 15:37:50 crc kubenswrapper[4965]: I1125 15:37:50.154004 4965 generic.go:334] "Generic (PLEG): container finished" podID="71388e9b-27dd-4af7-aa67-4330d051a98d" containerID="bf0af57838796b7335e5505c6c7d5f94b36e5f7932524db8a91753bb89d9918b" exitCode=0 Nov 25 15:37:50 crc kubenswrapper[4965]: I1125 15:37:50.154039 4965 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" event={"ID":"71388e9b-27dd-4af7-aa67-4330d051a98d","Type":"ContainerDied","Data":"bf0af57838796b7335e5505c6c7d5f94b36e5f7932524db8a91753bb89d9918b"} Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.655496 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.703834 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-ssh-key\") pod \"71388e9b-27dd-4af7-aa67-4330d051a98d\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.703939 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-inventory\") pod \"71388e9b-27dd-4af7-aa67-4330d051a98d\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.704024 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hpnb\" (UniqueName: \"kubernetes.io/projected/71388e9b-27dd-4af7-aa67-4330d051a98d-kube-api-access-5hpnb\") pod \"71388e9b-27dd-4af7-aa67-4330d051a98d\" (UID: \"71388e9b-27dd-4af7-aa67-4330d051a98d\") " Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.712790 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71388e9b-27dd-4af7-aa67-4330d051a98d-kube-api-access-5hpnb" (OuterVolumeSpecName: "kube-api-access-5hpnb") pod "71388e9b-27dd-4af7-aa67-4330d051a98d" (UID: "71388e9b-27dd-4af7-aa67-4330d051a98d"). InnerVolumeSpecName "kube-api-access-5hpnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.732114 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71388e9b-27dd-4af7-aa67-4330d051a98d" (UID: "71388e9b-27dd-4af7-aa67-4330d051a98d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.734047 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-inventory" (OuterVolumeSpecName: "inventory") pod "71388e9b-27dd-4af7-aa67-4330d051a98d" (UID: "71388e9b-27dd-4af7-aa67-4330d051a98d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.806409 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hpnb\" (UniqueName: \"kubernetes.io/projected/71388e9b-27dd-4af7-aa67-4330d051a98d-kube-api-access-5hpnb\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.806444 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:51 crc kubenswrapper[4965]: I1125 15:37:51.806456 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71388e9b-27dd-4af7-aa67-4330d051a98d-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.177739 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" 
event={"ID":"71388e9b-27dd-4af7-aa67-4330d051a98d","Type":"ContainerDied","Data":"3b77c4b535e3948ae1ea3ab6173a7fc491e4f60d8a6beec8721d17f0c4339e04"} Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.177775 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b77c4b535e3948ae1ea3ab6173a7fc491e4f60d8a6beec8721d17f0c4339e04" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.177817 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-trs8x" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.259947 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52"] Nov 25 15:37:52 crc kubenswrapper[4965]: E1125 15:37:52.260291 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71388e9b-27dd-4af7-aa67-4330d051a98d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.260308 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="71388e9b-27dd-4af7-aa67-4330d051a98d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.260481 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="71388e9b-27dd-4af7-aa67-4330d051a98d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.261006 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.264328 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.268097 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.268183 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.270071 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.286645 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52"] Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.319488 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.320121 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.320222 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txqkk\" (UniqueName: \"kubernetes.io/projected/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-kube-api-access-txqkk\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.421980 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.422065 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txqkk\" (UniqueName: \"kubernetes.io/projected/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-kube-api-access-txqkk\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.422108 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.434240 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: 
\"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.441234 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.453729 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txqkk\" (UniqueName: \"kubernetes.io/projected/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-kube-api-access-txqkk\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:52 crc kubenswrapper[4965]: I1125 15:37:52.582041 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:37:53 crc kubenswrapper[4965]: I1125 15:37:53.093150 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52"] Nov 25 15:37:53 crc kubenswrapper[4965]: I1125 15:37:53.190358 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" event={"ID":"756d59ef-4e5c-4a8c-8295-67ecbb70e81e","Type":"ContainerStarted","Data":"e7859a9f904ac783a2b133a7c959858f013390ca98d3d13c6a10ba5d972fe826"} Nov 25 15:37:53 crc kubenswrapper[4965]: I1125 15:37:53.260389 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:37:53 crc kubenswrapper[4965]: I1125 15:37:53.260455 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:37:54 crc kubenswrapper[4965]: I1125 15:37:54.059550 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4f01-account-create-6rdbh"] Nov 25 15:37:54 crc kubenswrapper[4965]: I1125 15:37:54.068434 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4f01-account-create-6rdbh"] Nov 25 15:37:54 crc kubenswrapper[4965]: I1125 15:37:54.799618 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b31744-51a0-4e7e-8239-0ca59000796e" path="/var/lib/kubelet/pods/b2b31744-51a0-4e7e-8239-0ca59000796e/volumes" Nov 25 15:37:55 crc 
kubenswrapper[4965]: I1125 15:37:55.214237 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" event={"ID":"756d59ef-4e5c-4a8c-8295-67ecbb70e81e","Type":"ContainerStarted","Data":"5e03da7d736ff84bbafddae85b9f66c12d8f655d2d600b2994f9bf8da17789ef"} Nov 25 15:37:55 crc kubenswrapper[4965]: I1125 15:37:55.238244 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" podStartSLOduration=1.935036315 podStartE2EDuration="3.238219851s" podCreationTimestamp="2025-11-25 15:37:52 +0000 UTC" firstStartedPulling="2025-11-25 15:37:53.094675894 +0000 UTC m=+2018.062269640" lastFinishedPulling="2025-11-25 15:37:54.39785942 +0000 UTC m=+2019.365453176" observedRunningTime="2025-11-25 15:37:55.237881652 +0000 UTC m=+2020.205475408" watchObservedRunningTime="2025-11-25 15:37:55.238219851 +0000 UTC m=+2020.205813617" Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.033792 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cszg8"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.043228 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cszg8"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.052884 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f6d8-account-create-mfxnx"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.063133 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mlzsx"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.072915 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f6d8-account-create-mfxnx"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.080774 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-pjhn7"] Nov 25 15:37:56 crc 
kubenswrapper[4965]: I1125 15:37:56.089402 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mlzsx"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.096182 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ea7f-account-create-m7smz"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.102393 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ea7f-account-create-m7smz"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.108980 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-pjhn7"] Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.792419 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fd6c78-af14-40ad-9fca-ce8a2d0370ba" path="/var/lib/kubelet/pods/43fd6c78-af14-40ad-9fca-ce8a2d0370ba/volumes" Nov 25 15:37:56 crc kubenswrapper[4965]: I1125 15:37:56.793046 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443303d8-fe0f-4af1-8ce8-0e99085d5e49" path="/var/lib/kubelet/pods/443303d8-fe0f-4af1-8ce8-0e99085d5e49/volumes" Nov 25 15:37:57 crc kubenswrapper[4965]: I1125 15:37:57.065923 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668762c3-71e3-42b7-974e-734b02cdbc1c" path="/var/lib/kubelet/pods/668762c3-71e3-42b7-974e-734b02cdbc1c/volumes" Nov 25 15:37:57 crc kubenswrapper[4965]: I1125 15:37:57.067477 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada44be1-c488-4ad3-bee0-fe9c0d7b84af" path="/var/lib/kubelet/pods/ada44be1-c488-4ad3-bee0-fe9c0d7b84af/volumes" Nov 25 15:37:57 crc kubenswrapper[4965]: I1125 15:37:57.068502 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a5305f-d6dc-4bf3-b356-4d278731672d" path="/var/lib/kubelet/pods/f6a5305f-d6dc-4bf3-b356-4d278731672d/volumes" Nov 25 15:37:59 crc kubenswrapper[4965]: I1125 15:37:59.244701 4965 generic.go:334] "Generic (PLEG): 
container finished" podID="756d59ef-4e5c-4a8c-8295-67ecbb70e81e" containerID="5e03da7d736ff84bbafddae85b9f66c12d8f655d2d600b2994f9bf8da17789ef" exitCode=0 Nov 25 15:37:59 crc kubenswrapper[4965]: I1125 15:37:59.244794 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" event={"ID":"756d59ef-4e5c-4a8c-8295-67ecbb70e81e","Type":"ContainerDied","Data":"5e03da7d736ff84bbafddae85b9f66c12d8f655d2d600b2994f9bf8da17789ef"} Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.636389 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.824661 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txqkk\" (UniqueName: \"kubernetes.io/projected/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-kube-api-access-txqkk\") pod \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.825126 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-ssh-key\") pod \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.825223 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-inventory\") pod \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\" (UID: \"756d59ef-4e5c-4a8c-8295-67ecbb70e81e\") " Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.829999 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-kube-api-access-txqkk" 
(OuterVolumeSpecName: "kube-api-access-txqkk") pod "756d59ef-4e5c-4a8c-8295-67ecbb70e81e" (UID: "756d59ef-4e5c-4a8c-8295-67ecbb70e81e"). InnerVolumeSpecName "kube-api-access-txqkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.853345 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-inventory" (OuterVolumeSpecName: "inventory") pod "756d59ef-4e5c-4a8c-8295-67ecbb70e81e" (UID: "756d59ef-4e5c-4a8c-8295-67ecbb70e81e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.855196 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "756d59ef-4e5c-4a8c-8295-67ecbb70e81e" (UID: "756d59ef-4e5c-4a8c-8295-67ecbb70e81e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.927113 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.927141 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txqkk\" (UniqueName: \"kubernetes.io/projected/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-kube-api-access-txqkk\") on node \"crc\" DevicePath \"\"" Nov 25 15:38:00 crc kubenswrapper[4965]: I1125 15:38:00.927152 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/756d59ef-4e5c-4a8c-8295-67ecbb70e81e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.259654 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" event={"ID":"756d59ef-4e5c-4a8c-8295-67ecbb70e81e","Type":"ContainerDied","Data":"e7859a9f904ac783a2b133a7c959858f013390ca98d3d13c6a10ba5d972fe826"} Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.259693 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7859a9f904ac783a2b133a7c959858f013390ca98d3d13c6a10ba5d972fe826" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.259737 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.349918 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk"] Nov 25 15:38:01 crc kubenswrapper[4965]: E1125 15:38:01.350425 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756d59ef-4e5c-4a8c-8295-67ecbb70e81e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.350450 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="756d59ef-4e5c-4a8c-8295-67ecbb70e81e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.350720 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="756d59ef-4e5c-4a8c-8295-67ecbb70e81e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.351440 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.355312 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.355591 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.355764 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.356016 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.358956 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk"] Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.544725 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/ec41d4aa-6d2e-4068-82a9-18fd1489899b-kube-api-access-x8nbq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.545546 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.545642 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.648117 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.648272 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.648499 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/ec41d4aa-6d2e-4068-82a9-18fd1489899b-kube-api-access-x8nbq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.655735 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: 
\"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.656722 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.674645 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/ec41d4aa-6d2e-4068-82a9-18fd1489899b-kube-api-access-x8nbq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:01 crc kubenswrapper[4965]: I1125 15:38:01.974609 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:38:02 crc kubenswrapper[4965]: I1125 15:38:02.520621 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk"] Nov 25 15:38:03 crc kubenswrapper[4965]: I1125 15:38:03.275638 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" event={"ID":"ec41d4aa-6d2e-4068-82a9-18fd1489899b","Type":"ContainerStarted","Data":"8b04ab748bdbc718bd87b00b04c9d8da1f2c6da2a8e84a8a07e41b36f1801dce"} Nov 25 15:38:04 crc kubenswrapper[4965]: I1125 15:38:04.284189 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" event={"ID":"ec41d4aa-6d2e-4068-82a9-18fd1489899b","Type":"ContainerStarted","Data":"a6d204cf7c0c919b660919bfa0139e4ddbf854790f543a1065f5875f5a1ae4ef"} Nov 25 15:38:04 crc kubenswrapper[4965]: I1125 15:38:04.303322 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" podStartSLOduration=2.536246644 podStartE2EDuration="3.303305682s" podCreationTimestamp="2025-11-25 15:38:01 +0000 UTC" firstStartedPulling="2025-11-25 15:38:02.524326518 +0000 UTC m=+2027.491920264" lastFinishedPulling="2025-11-25 15:38:03.291385556 +0000 UTC m=+2028.258979302" observedRunningTime="2025-11-25 15:38:04.300038254 +0000 UTC m=+2029.267632000" watchObservedRunningTime="2025-11-25 15:38:04.303305682 +0000 UTC m=+2029.270899428" Nov 25 15:38:23 crc kubenswrapper[4965]: I1125 15:38:23.260510 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:38:23 crc 
kubenswrapper[4965]: I1125 15:38:23.260960 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:38:34 crc kubenswrapper[4965]: I1125 15:38:34.661544 4965 scope.go:117] "RemoveContainer" containerID="2b708fc65e9bad06225fe7eb94f5f0fd7bf4aab88576225fe591407a02fac35f" Nov 25 15:38:34 crc kubenswrapper[4965]: I1125 15:38:34.688491 4965 scope.go:117] "RemoveContainer" containerID="fe86c3c0a2e0e7d0fa27ce05e8e36e30f97266e6cdfb280f738d4b22436f4287" Nov 25 15:38:34 crc kubenswrapper[4965]: I1125 15:38:34.765063 4965 scope.go:117] "RemoveContainer" containerID="e55b868f3f91ca9a8d6da8bb998e9fa66ea93128ac3fa1b050ba26a8a75a169f" Nov 25 15:38:34 crc kubenswrapper[4965]: I1125 15:38:34.848659 4965 scope.go:117] "RemoveContainer" containerID="2882683b376024b3a00294d67797ac5fca63691b16ef26abf8edb6e8df7cd87e" Nov 25 15:38:34 crc kubenswrapper[4965]: I1125 15:38:34.875794 4965 scope.go:117] "RemoveContainer" containerID="7c41b069cd66df87c972997bcbd7e6b850c5701c808a405b9762dac3f41e4caa" Nov 25 15:38:34 crc kubenswrapper[4965]: I1125 15:38:34.911266 4965 scope.go:117] "RemoveContainer" containerID="5947467abd4d280588c55bd5f26d68615c4ba3e0d5be62847489a19610b9b689" Nov 25 15:38:48 crc kubenswrapper[4965]: I1125 15:38:48.877459 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8xzhd"] Nov 25 15:38:48 crc kubenswrapper[4965]: I1125 15:38:48.880168 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:48 crc kubenswrapper[4965]: I1125 15:38:48.895282 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8xzhd"] Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.030231 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/82f473c0-fa95-4926-ae9d-e1290ca6265e-kube-api-access-6l52h\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.030998 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-catalog-content\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.031098 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-utilities\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.132314 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-utilities\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.132389 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/82f473c0-fa95-4926-ae9d-e1290ca6265e-kube-api-access-6l52h\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.132489 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-catalog-content\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.132754 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-utilities\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.132897 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-catalog-content\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.152382 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/82f473c0-fa95-4926-ae9d-e1290ca6265e-kube-api-access-6l52h\") pod \"redhat-operators-8xzhd\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.204106 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.666019 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8xzhd"] Nov 25 15:38:49 crc kubenswrapper[4965]: I1125 15:38:49.793426 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerStarted","Data":"63ff29b7470eb9bb8d45aa8dd7a27b3fccb4bec4df42cbe0a5a9c84512a4b49f"} Nov 25 15:38:50 crc kubenswrapper[4965]: I1125 15:38:50.803951 4965 generic.go:334] "Generic (PLEG): container finished" podID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerID="adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306" exitCode=0 Nov 25 15:38:50 crc kubenswrapper[4965]: I1125 15:38:50.804136 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerDied","Data":"adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306"} Nov 25 15:38:51 crc kubenswrapper[4965]: I1125 15:38:51.815908 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerStarted","Data":"a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc"} Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.260101 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.260520 4965 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.260576 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.261328 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a78849570ebdec0ecb7f8223886210e662c725014f30c20e18e315182e077c80"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.261405 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://a78849570ebdec0ecb7f8223886210e662c725014f30c20e18e315182e077c80" gracePeriod=600 Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.844172 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="a78849570ebdec0ecb7f8223886210e662c725014f30c20e18e315182e077c80" exitCode=0 Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.844212 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"a78849570ebdec0ecb7f8223886210e662c725014f30c20e18e315182e077c80"} Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.844237 4965 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24"} Nov 25 15:38:53 crc kubenswrapper[4965]: I1125 15:38:53.844253 4965 scope.go:117] "RemoveContainer" containerID="79b7e49cbc52d932189378d522bf95865e49018029bc94caac349304290605ab" Nov 25 15:39:00 crc kubenswrapper[4965]: I1125 15:39:00.917185 4965 generic.go:334] "Generic (PLEG): container finished" podID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerID="a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc" exitCode=0 Nov 25 15:39:00 crc kubenswrapper[4965]: I1125 15:39:00.917793 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerDied","Data":"a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc"} Nov 25 15:39:02 crc kubenswrapper[4965]: I1125 15:39:02.941458 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerStarted","Data":"6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4"} Nov 25 15:39:02 crc kubenswrapper[4965]: I1125 15:39:02.965253 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8xzhd" podStartSLOduration=3.427505649 podStartE2EDuration="14.965230397s" podCreationTimestamp="2025-11-25 15:38:48 +0000 UTC" firstStartedPulling="2025-11-25 15:38:50.807091379 +0000 UTC m=+2075.774685125" lastFinishedPulling="2025-11-25 15:39:02.344816117 +0000 UTC m=+2087.312409873" observedRunningTime="2025-11-25 15:39:02.960518038 +0000 UTC m=+2087.928111784" watchObservedRunningTime="2025-11-25 15:39:02.965230397 +0000 UTC m=+2087.932824183" Nov 25 15:39:04 crc kubenswrapper[4965]: I1125 
15:39:04.958349 4965 generic.go:334] "Generic (PLEG): container finished" podID="ec41d4aa-6d2e-4068-82a9-18fd1489899b" containerID="a6d204cf7c0c919b660919bfa0139e4ddbf854790f543a1065f5875f5a1ae4ef" exitCode=0 Nov 25 15:39:04 crc kubenswrapper[4965]: I1125 15:39:04.958452 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" event={"ID":"ec41d4aa-6d2e-4068-82a9-18fd1489899b","Type":"ContainerDied","Data":"a6d204cf7c0c919b660919bfa0139e4ddbf854790f543a1065f5875f5a1ae4ef"} Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.456915 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.563336 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/ec41d4aa-6d2e-4068-82a9-18fd1489899b-kube-api-access-x8nbq\") pod \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.564328 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-ssh-key\") pod \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.564409 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-inventory\") pod \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\" (UID: \"ec41d4aa-6d2e-4068-82a9-18fd1489899b\") " Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.580313 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ec41d4aa-6d2e-4068-82a9-18fd1489899b-kube-api-access-x8nbq" (OuterVolumeSpecName: "kube-api-access-x8nbq") pod "ec41d4aa-6d2e-4068-82a9-18fd1489899b" (UID: "ec41d4aa-6d2e-4068-82a9-18fd1489899b"). InnerVolumeSpecName "kube-api-access-x8nbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.596649 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec41d4aa-6d2e-4068-82a9-18fd1489899b" (UID: "ec41d4aa-6d2e-4068-82a9-18fd1489899b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.600124 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-inventory" (OuterVolumeSpecName: "inventory") pod "ec41d4aa-6d2e-4068-82a9-18fd1489899b" (UID: "ec41d4aa-6d2e-4068-82a9-18fd1489899b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.667664 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8nbq\" (UniqueName: \"kubernetes.io/projected/ec41d4aa-6d2e-4068-82a9-18fd1489899b-kube-api-access-x8nbq\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.667705 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.667720 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec41d4aa-6d2e-4068-82a9-18fd1489899b-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.981706 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" event={"ID":"ec41d4aa-6d2e-4068-82a9-18fd1489899b","Type":"ContainerDied","Data":"8b04ab748bdbc718bd87b00b04c9d8da1f2c6da2a8e84a8a07e41b36f1801dce"} Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.981752 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b04ab748bdbc718bd87b00b04c9d8da1f2c6da2a8e84a8a07e41b36f1801dce" Nov 25 15:39:06 crc kubenswrapper[4965]: I1125 15:39:06.981761 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.076247 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sd7b2"] Nov 25 15:39:07 crc kubenswrapper[4965]: E1125 15:39:07.076616 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec41d4aa-6d2e-4068-82a9-18fd1489899b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.076632 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec41d4aa-6d2e-4068-82a9-18fd1489899b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.076822 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec41d4aa-6d2e-4068-82a9-18fd1489899b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.077409 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.080364 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.080544 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.081548 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.084932 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.092800 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sd7b2"] Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.177859 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8st22\" (UniqueName: \"kubernetes.io/projected/c6c05a6e-b7d5-4906-96b7-713e13170260-kube-api-access-8st22\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.177941 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.178075 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.280100 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.280568 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8st22\" (UniqueName: \"kubernetes.io/projected/c6c05a6e-b7d5-4906-96b7-713e13170260-kube-api-access-8st22\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.280807 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.296911 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: 
I1125 15:39:07.297103 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.305745 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8st22\" (UniqueName: \"kubernetes.io/projected/c6c05a6e-b7d5-4906-96b7-713e13170260-kube-api-access-8st22\") pod \"ssh-known-hosts-edpm-deployment-sd7b2\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:07 crc kubenswrapper[4965]: I1125 15:39:07.393692 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:08 crc kubenswrapper[4965]: I1125 15:39:08.002021 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sd7b2"] Nov 25 15:39:08 crc kubenswrapper[4965]: W1125 15:39:08.013440 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6c05a6e_b7d5_4906_96b7_713e13170260.slice/crio-4fbbf296735b47ea598dafa593bafbdf86760713ec7e478dd907e8f95341d554 WatchSource:0}: Error finding container 4fbbf296735b47ea598dafa593bafbdf86760713ec7e478dd907e8f95341d554: Status 404 returned error can't find the container with id 4fbbf296735b47ea598dafa593bafbdf86760713ec7e478dd907e8f95341d554 Nov 25 15:39:09 crc kubenswrapper[4965]: I1125 15:39:09.008480 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" event={"ID":"c6c05a6e-b7d5-4906-96b7-713e13170260","Type":"ContainerStarted","Data":"465cca8ab04d429084417656c58e1e169ac8a617db74a69d7811e23867db4410"} Nov 25 
15:39:09 crc kubenswrapper[4965]: I1125 15:39:09.008833 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" event={"ID":"c6c05a6e-b7d5-4906-96b7-713e13170260","Type":"ContainerStarted","Data":"4fbbf296735b47ea598dafa593bafbdf86760713ec7e478dd907e8f95341d554"} Nov 25 15:39:09 crc kubenswrapper[4965]: I1125 15:39:09.035010 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" podStartSLOduration=1.5916786539999999 podStartE2EDuration="2.034991248s" podCreationTimestamp="2025-11-25 15:39:07 +0000 UTC" firstStartedPulling="2025-11-25 15:39:08.015976489 +0000 UTC m=+2092.983570235" lastFinishedPulling="2025-11-25 15:39:08.459289083 +0000 UTC m=+2093.426882829" observedRunningTime="2025-11-25 15:39:09.025190841 +0000 UTC m=+2093.992784587" watchObservedRunningTime="2025-11-25 15:39:09.034991248 +0000 UTC m=+2094.002584994" Nov 25 15:39:09 crc kubenswrapper[4965]: I1125 15:39:09.204985 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:39:09 crc kubenswrapper[4965]: I1125 15:39:09.205326 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:39:10 crc kubenswrapper[4965]: I1125 15:39:10.249018 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8xzhd" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="registry-server" probeResult="failure" output=< Nov 25 15:39:10 crc kubenswrapper[4965]: timeout: failed to connect service ":50051" within 1s Nov 25 15:39:10 crc kubenswrapper[4965]: > Nov 25 15:39:17 crc kubenswrapper[4965]: I1125 15:39:17.085367 4965 generic.go:334] "Generic (PLEG): container finished" podID="c6c05a6e-b7d5-4906-96b7-713e13170260" containerID="465cca8ab04d429084417656c58e1e169ac8a617db74a69d7811e23867db4410" 
exitCode=0 Nov 25 15:39:17 crc kubenswrapper[4965]: I1125 15:39:17.085418 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" event={"ID":"c6c05a6e-b7d5-4906-96b7-713e13170260","Type":"ContainerDied","Data":"465cca8ab04d429084417656c58e1e169ac8a617db74a69d7811e23867db4410"} Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.486517 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.613468 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-ssh-key-openstack-edpm-ipam\") pod \"c6c05a6e-b7d5-4906-96b7-713e13170260\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.613530 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8st22\" (UniqueName: \"kubernetes.io/projected/c6c05a6e-b7d5-4906-96b7-713e13170260-kube-api-access-8st22\") pod \"c6c05a6e-b7d5-4906-96b7-713e13170260\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.613567 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-inventory-0\") pod \"c6c05a6e-b7d5-4906-96b7-713e13170260\" (UID: \"c6c05a6e-b7d5-4906-96b7-713e13170260\") " Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.624901 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c05a6e-b7d5-4906-96b7-713e13170260-kube-api-access-8st22" (OuterVolumeSpecName: "kube-api-access-8st22") pod "c6c05a6e-b7d5-4906-96b7-713e13170260" (UID: "c6c05a6e-b7d5-4906-96b7-713e13170260"). 
InnerVolumeSpecName "kube-api-access-8st22". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.650192 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c6c05a6e-b7d5-4906-96b7-713e13170260" (UID: "c6c05a6e-b7d5-4906-96b7-713e13170260"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.653550 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c6c05a6e-b7d5-4906-96b7-713e13170260" (UID: "c6c05a6e-b7d5-4906-96b7-713e13170260"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.716279 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.716308 4965 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6c05a6e-b7d5-4906-96b7-713e13170260-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:18 crc kubenswrapper[4965]: I1125 15:39:18.716318 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8st22\" (UniqueName: \"kubernetes.io/projected/c6c05a6e-b7d5-4906-96b7-713e13170260-kube-api-access-8st22\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.104664 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" event={"ID":"c6c05a6e-b7d5-4906-96b7-713e13170260","Type":"ContainerDied","Data":"4fbbf296735b47ea598dafa593bafbdf86760713ec7e478dd907e8f95341d554"} Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.105087 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fbbf296735b47ea598dafa593bafbdf86760713ec7e478dd907e8f95341d554" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.104803 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sd7b2" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.200551 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m"] Nov 25 15:39:19 crc kubenswrapper[4965]: E1125 15:39:19.201305 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c05a6e-b7d5-4906-96b7-713e13170260" containerName="ssh-known-hosts-edpm-deployment" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.201413 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c05a6e-b7d5-4906-96b7-713e13170260" containerName="ssh-known-hosts-edpm-deployment" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.201701 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c05a6e-b7d5-4906-96b7-713e13170260" containerName="ssh-known-hosts-edpm-deployment" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.202560 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.205583 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.206087 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.206109 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.206208 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.217851 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m"] Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.275501 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.326656 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.326714 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.326734 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s89t\" (UniqueName: \"kubernetes.io/projected/158e33b0-a941-4486-af32-c561ca8d32db-kube-api-access-5s89t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.338902 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.428795 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.428865 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.428905 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s89t\" (UniqueName: \"kubernetes.io/projected/158e33b0-a941-4486-af32-c561ca8d32db-kube-api-access-5s89t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 
15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.434189 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.444309 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.453908 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s89t\" (UniqueName: \"kubernetes.io/projected/158e33b0-a941-4486-af32-c561ca8d32db-kube-api-access-5s89t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9zx8m\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.521281 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:19 crc kubenswrapper[4965]: I1125 15:39:19.712587 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8xzhd"] Nov 25 15:39:20 crc kubenswrapper[4965]: I1125 15:39:20.108430 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m"] Nov 25 15:39:21 crc kubenswrapper[4965]: I1125 15:39:21.136469 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8xzhd" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="registry-server" containerID="cri-o://6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4" gracePeriod=2 Nov 25 15:39:21 crc kubenswrapper[4965]: I1125 15:39:21.136988 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" event={"ID":"158e33b0-a941-4486-af32-c561ca8d32db","Type":"ContainerStarted","Data":"2416453a8456612e652941772b992b54aa00ddbad42a4c2b421e6ed53fba9ffa"} Nov 25 15:39:21 crc kubenswrapper[4965]: I1125 15:39:21.137027 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" event={"ID":"158e33b0-a941-4486-af32-c561ca8d32db","Type":"ContainerStarted","Data":"8b1172185cb98fbbbf23d23288981c41309c278d2d0bf1d2bb0ce320a66efae3"} Nov 25 15:39:21 crc kubenswrapper[4965]: I1125 15:39:21.165317 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" podStartSLOduration=1.751296081 podStartE2EDuration="2.165302219s" podCreationTimestamp="2025-11-25 15:39:19 +0000 UTC" firstStartedPulling="2025-11-25 15:39:20.12940465 +0000 UTC m=+2105.096998396" lastFinishedPulling="2025-11-25 15:39:20.543410788 +0000 UTC m=+2105.511004534" observedRunningTime="2025-11-25 
15:39:21.162445922 +0000 UTC m=+2106.130039668" watchObservedRunningTime="2025-11-25 15:39:21.165302219 +0000 UTC m=+2106.132895965" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.091104 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.159112 4965 generic.go:334] "Generic (PLEG): container finished" podID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerID="6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4" exitCode=0 Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.160138 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8xzhd" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.160672 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerDied","Data":"6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4"} Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.160703 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8xzhd" event={"ID":"82f473c0-fa95-4926-ae9d-e1290ca6265e","Type":"ContainerDied","Data":"63ff29b7470eb9bb8d45aa8dd7a27b3fccb4bec4df42cbe0a5a9c84512a4b49f"} Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.160722 4965 scope.go:117] "RemoveContainer" containerID="6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.180551 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/82f473c0-fa95-4926-ae9d-e1290ca6265e-kube-api-access-6l52h\") pod \"82f473c0-fa95-4926-ae9d-e1290ca6265e\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " Nov 25 15:39:22 crc 
kubenswrapper[4965]: I1125 15:39:22.180722 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-utilities\") pod \"82f473c0-fa95-4926-ae9d-e1290ca6265e\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.180880 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-catalog-content\") pod \"82f473c0-fa95-4926-ae9d-e1290ca6265e\" (UID: \"82f473c0-fa95-4926-ae9d-e1290ca6265e\") " Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.182431 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-utilities" (OuterVolumeSpecName: "utilities") pod "82f473c0-fa95-4926-ae9d-e1290ca6265e" (UID: "82f473c0-fa95-4926-ae9d-e1290ca6265e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.208924 4965 scope.go:117] "RemoveContainer" containerID="a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.210204 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f473c0-fa95-4926-ae9d-e1290ca6265e-kube-api-access-6l52h" (OuterVolumeSpecName: "kube-api-access-6l52h") pod "82f473c0-fa95-4926-ae9d-e1290ca6265e" (UID: "82f473c0-fa95-4926-ae9d-e1290ca6265e"). InnerVolumeSpecName "kube-api-access-6l52h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.246776 4965 scope.go:117] "RemoveContainer" containerID="adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.283179 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/82f473c0-fa95-4926-ae9d-e1290ca6265e-kube-api-access-6l52h\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.283217 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.285833 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82f473c0-fa95-4926-ae9d-e1290ca6265e" (UID: "82f473c0-fa95-4926-ae9d-e1290ca6265e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.290784 4965 scope.go:117] "RemoveContainer" containerID="6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4" Nov 25 15:39:22 crc kubenswrapper[4965]: E1125 15:39:22.291869 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4\": container with ID starting with 6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4 not found: ID does not exist" containerID="6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.291934 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4"} err="failed to get container status \"6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4\": rpc error: code = NotFound desc = could not find container \"6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4\": container with ID starting with 6f1e94631832c17b81c47be46a7498ab3a256f16c20a837d1dcfb79d0b641bf4 not found: ID does not exist" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.291990 4965 scope.go:117] "RemoveContainer" containerID="a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc" Nov 25 15:39:22 crc kubenswrapper[4965]: E1125 15:39:22.292312 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc\": container with ID starting with a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc not found: ID does not exist" containerID="a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.292339 
4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc"} err="failed to get container status \"a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc\": rpc error: code = NotFound desc = could not find container \"a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc\": container with ID starting with a46085358b7a0e58efc6b349fc86f9dba34b8434eca8fd5e0b7bf3ec12965fbc not found: ID does not exist" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.292358 4965 scope.go:117] "RemoveContainer" containerID="adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306" Nov 25 15:39:22 crc kubenswrapper[4965]: E1125 15:39:22.292579 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306\": container with ID starting with adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306 not found: ID does not exist" containerID="adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.292624 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306"} err="failed to get container status \"adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306\": rpc error: code = NotFound desc = could not find container \"adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306\": container with ID starting with adb925fd702d190c665715867f597b64f4ee8903fffb3e2d369e708854f30306 not found: ID does not exist" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.384874 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/82f473c0-fa95-4926-ae9d-e1290ca6265e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.494386 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8xzhd"] Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.503536 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8xzhd"] Nov 25 15:39:22 crc kubenswrapper[4965]: I1125 15:39:22.784093 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" path="/var/lib/kubelet/pods/82f473c0-fa95-4926-ae9d-e1290ca6265e/volumes" Nov 25 15:39:23 crc kubenswrapper[4965]: I1125 15:39:23.050168 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hkzn5"] Nov 25 15:39:23 crc kubenswrapper[4965]: I1125 15:39:23.059268 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hkzn5"] Nov 25 15:39:24 crc kubenswrapper[4965]: I1125 15:39:24.782507 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069" path="/var/lib/kubelet/pods/c4aa1e5c-ad0b-4c70-bba3-d3a2f370d069/volumes" Nov 25 15:39:31 crc kubenswrapper[4965]: I1125 15:39:31.235954 4965 generic.go:334] "Generic (PLEG): container finished" podID="158e33b0-a941-4486-af32-c561ca8d32db" containerID="2416453a8456612e652941772b992b54aa00ddbad42a4c2b421e6ed53fba9ffa" exitCode=0 Nov 25 15:39:31 crc kubenswrapper[4965]: I1125 15:39:31.236023 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" event={"ID":"158e33b0-a941-4486-af32-c561ca8d32db","Type":"ContainerDied","Data":"2416453a8456612e652941772b992b54aa00ddbad42a4c2b421e6ed53fba9ffa"} Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.664020 4965 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.764822 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-inventory\") pod \"158e33b0-a941-4486-af32-c561ca8d32db\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.765030 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s89t\" (UniqueName: \"kubernetes.io/projected/158e33b0-a941-4486-af32-c561ca8d32db-kube-api-access-5s89t\") pod \"158e33b0-a941-4486-af32-c561ca8d32db\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.765087 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-ssh-key\") pod \"158e33b0-a941-4486-af32-c561ca8d32db\" (UID: \"158e33b0-a941-4486-af32-c561ca8d32db\") " Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.770397 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158e33b0-a941-4486-af32-c561ca8d32db-kube-api-access-5s89t" (OuterVolumeSpecName: "kube-api-access-5s89t") pod "158e33b0-a941-4486-af32-c561ca8d32db" (UID: "158e33b0-a941-4486-af32-c561ca8d32db"). InnerVolumeSpecName "kube-api-access-5s89t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.798297 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "158e33b0-a941-4486-af32-c561ca8d32db" (UID: "158e33b0-a941-4486-af32-c561ca8d32db"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.799364 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-inventory" (OuterVolumeSpecName: "inventory") pod "158e33b0-a941-4486-af32-c561ca8d32db" (UID: "158e33b0-a941-4486-af32-c561ca8d32db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.867101 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s89t\" (UniqueName: \"kubernetes.io/projected/158e33b0-a941-4486-af32-c561ca8d32db-kube-api-access-5s89t\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.867371 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:32 crc kubenswrapper[4965]: I1125 15:39:32.867383 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158e33b0-a941-4486-af32-c561ca8d32db-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.253870 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" event={"ID":"158e33b0-a941-4486-af32-c561ca8d32db","Type":"ContainerDied","Data":"8b1172185cb98fbbbf23d23288981c41309c278d2d0bf1d2bb0ce320a66efae3"} Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.253921 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1172185cb98fbbbf23d23288981c41309c278d2d0bf1d2bb0ce320a66efae3" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.254035 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9zx8m" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.335379 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz"] Nov 25 15:39:33 crc kubenswrapper[4965]: E1125 15:39:33.336051 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="extract-utilities" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.336076 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="extract-utilities" Nov 25 15:39:33 crc kubenswrapper[4965]: E1125 15:39:33.336099 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="registry-server" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.336127 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="registry-server" Nov 25 15:39:33 crc kubenswrapper[4965]: E1125 15:39:33.336144 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="extract-content" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.336153 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="extract-content" Nov 25 15:39:33 crc kubenswrapper[4965]: E1125 15:39:33.336169 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158e33b0-a941-4486-af32-c561ca8d32db" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.336177 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="158e33b0-a941-4486-af32-c561ca8d32db" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.336390 4965 
memory_manager.go:354] "RemoveStaleState removing state" podUID="158e33b0-a941-4486-af32-c561ca8d32db" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.336423 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f473c0-fa95-4926-ae9d-e1290ca6265e" containerName="registry-server" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.337332 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.349799 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.350203 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.350312 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz"] Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.350330 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.350485 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x9c2x" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.476827 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5bq\" (UniqueName: \"kubernetes.io/projected/9f4689ce-557b-4194-9502-fe642064225e-kube-api-access-cf5bq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 
15:39:33.477381 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.477639 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.579809 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.579920 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.580001 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5bq\" (UniqueName: \"kubernetes.io/projected/9f4689ce-557b-4194-9502-fe642064225e-kube-api-access-cf5bq\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.590213 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.590249 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.610033 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5bq\" (UniqueName: \"kubernetes.io/projected/9f4689ce-557b-4194-9502-fe642064225e-kube-api-access-cf5bq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:33 crc kubenswrapper[4965]: I1125 15:39:33.661116 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:34 crc kubenswrapper[4965]: I1125 15:39:34.249489 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz"] Nov 25 15:39:35 crc kubenswrapper[4965]: I1125 15:39:35.048743 4965 scope.go:117] "RemoveContainer" containerID="334477545a2b33183c8778624e141b60ac0c37b3664304d1268e5cef3619157a" Nov 25 15:39:35 crc kubenswrapper[4965]: I1125 15:39:35.270047 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" event={"ID":"9f4689ce-557b-4194-9502-fe642064225e","Type":"ContainerStarted","Data":"c72051dabebe16951100656d13982e07aa215eea8759802d95a4cb74864f0c29"} Nov 25 15:39:35 crc kubenswrapper[4965]: I1125 15:39:35.270100 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" event={"ID":"9f4689ce-557b-4194-9502-fe642064225e","Type":"ContainerStarted","Data":"68250719d38f14619983a0ee1eea1dcb1461aa5aac80610a0ddd5cfad5317107"} Nov 25 15:39:35 crc kubenswrapper[4965]: I1125 15:39:35.288172 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" podStartSLOduration=1.866753223 podStartE2EDuration="2.288146061s" podCreationTimestamp="2025-11-25 15:39:33 +0000 UTC" firstStartedPulling="2025-11-25 15:39:34.258729769 +0000 UTC m=+2119.226323515" lastFinishedPulling="2025-11-25 15:39:34.680122607 +0000 UTC m=+2119.647716353" observedRunningTime="2025-11-25 15:39:35.285697294 +0000 UTC m=+2120.253291050" watchObservedRunningTime="2025-11-25 15:39:35.288146061 +0000 UTC m=+2120.255739817" Nov 25 15:39:44 crc kubenswrapper[4965]: I1125 15:39:44.071570 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qmnc9"] Nov 25 15:39:44 crc kubenswrapper[4965]: I1125 
15:39:44.081061 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qmnc9"] Nov 25 15:39:44 crc kubenswrapper[4965]: I1125 15:39:44.783391 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697f5736-bc04-45e5-a874-982e8cd7d8e3" path="/var/lib/kubelet/pods/697f5736-bc04-45e5-a874-982e8cd7d8e3/volumes" Nov 25 15:39:46 crc kubenswrapper[4965]: I1125 15:39:46.358452 4965 generic.go:334] "Generic (PLEG): container finished" podID="9f4689ce-557b-4194-9502-fe642064225e" containerID="c72051dabebe16951100656d13982e07aa215eea8759802d95a4cb74864f0c29" exitCode=0 Nov 25 15:39:46 crc kubenswrapper[4965]: I1125 15:39:46.358485 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" event={"ID":"9f4689ce-557b-4194-9502-fe642064225e","Type":"ContainerDied","Data":"c72051dabebe16951100656d13982e07aa215eea8759802d95a4cb74864f0c29"} Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.775151 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.840737 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-ssh-key\") pod \"9f4689ce-557b-4194-9502-fe642064225e\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.840919 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-inventory\") pod \"9f4689ce-557b-4194-9502-fe642064225e\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.841033 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5bq\" (UniqueName: \"kubernetes.io/projected/9f4689ce-557b-4194-9502-fe642064225e-kube-api-access-cf5bq\") pod \"9f4689ce-557b-4194-9502-fe642064225e\" (UID: \"9f4689ce-557b-4194-9502-fe642064225e\") " Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.854329 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4689ce-557b-4194-9502-fe642064225e-kube-api-access-cf5bq" (OuterVolumeSpecName: "kube-api-access-cf5bq") pod "9f4689ce-557b-4194-9502-fe642064225e" (UID: "9f4689ce-557b-4194-9502-fe642064225e"). InnerVolumeSpecName "kube-api-access-cf5bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.869259 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-inventory" (OuterVolumeSpecName: "inventory") pod "9f4689ce-557b-4194-9502-fe642064225e" (UID: "9f4689ce-557b-4194-9502-fe642064225e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.877058 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f4689ce-557b-4194-9502-fe642064225e" (UID: "9f4689ce-557b-4194-9502-fe642064225e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.943193 4965 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.943250 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5bq\" (UniqueName: \"kubernetes.io/projected/9f4689ce-557b-4194-9502-fe642064225e-kube-api-access-cf5bq\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:47 crc kubenswrapper[4965]: I1125 15:39:47.943262 4965 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4689ce-557b-4194-9502-fe642064225e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:48 crc kubenswrapper[4965]: I1125 15:39:48.373825 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" event={"ID":"9f4689ce-557b-4194-9502-fe642064225e","Type":"ContainerDied","Data":"68250719d38f14619983a0ee1eea1dcb1461aa5aac80610a0ddd5cfad5317107"} Nov 25 15:39:48 crc kubenswrapper[4965]: I1125 15:39:48.373881 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68250719d38f14619983a0ee1eea1dcb1461aa5aac80610a0ddd5cfad5317107" Nov 25 15:39:48 crc kubenswrapper[4965]: I1125 15:39:48.373947 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz" Nov 25 15:39:53 crc kubenswrapper[4965]: I1125 15:39:53.041078 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r7rws"] Nov 25 15:39:53 crc kubenswrapper[4965]: I1125 15:39:53.055787 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r7rws"] Nov 25 15:39:54 crc kubenswrapper[4965]: I1125 15:39:54.780110 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814b2e75-fe75-4874-b493-f739902f9202" path="/var/lib/kubelet/pods/814b2e75-fe75-4874-b493-f739902f9202/volumes" Nov 25 15:40:29 crc kubenswrapper[4965]: I1125 15:40:29.048687 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xs2g6"] Nov 25 15:40:29 crc kubenswrapper[4965]: I1125 15:40:29.071189 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xs2g6"] Nov 25 15:40:30 crc kubenswrapper[4965]: I1125 15:40:30.790686 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c0d36b9-701f-42e2-9a0d-03d0db171c59" path="/var/lib/kubelet/pods/2c0d36b9-701f-42e2-9a0d-03d0db171c59/volumes" Nov 25 15:40:35 crc kubenswrapper[4965]: I1125 15:40:35.147744 4965 scope.go:117] "RemoveContainer" containerID="22c5bab323a242afa2cf3d32c11e12c884aa608430fc5ea7d63e62abf8bcbdfb" Nov 25 15:40:35 crc kubenswrapper[4965]: I1125 15:40:35.215236 4965 scope.go:117] "RemoveContainer" containerID="cb9a4beaf181edc3422d7d7eafc3531f7cecdb96b1dfb138abfaef7b5872ff7b" Nov 25 15:40:35 crc kubenswrapper[4965]: I1125 15:40:35.282900 4965 scope.go:117] "RemoveContainer" containerID="e3e6372085103461a2828509236bdbd47361dcb86a73d0c276e3e6e89ef91891" Nov 25 15:40:53 crc kubenswrapper[4965]: I1125 15:40:53.260341 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:40:53 crc kubenswrapper[4965]: I1125 15:40:53.260949 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.422201 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lb7rf"] Nov 25 15:41:04 crc kubenswrapper[4965]: E1125 15:41:04.423598 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4689ce-557b-4194-9502-fe642064225e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.423623 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4689ce-557b-4194-9502-fe642064225e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.424022 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4689ce-557b-4194-9502-fe642064225e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.426581 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.433693 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb7rf"] Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.504802 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-catalog-content\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.505162 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxpn9\" (UniqueName: \"kubernetes.io/projected/4a249dbf-2750-4720-b4f5-3e594b716744-kube-api-access-wxpn9\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.505199 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-utilities\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.606719 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxpn9\" (UniqueName: \"kubernetes.io/projected/4a249dbf-2750-4720-b4f5-3e594b716744-kube-api-access-wxpn9\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.606802 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-utilities\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.606951 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-catalog-content\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.607334 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-utilities\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.607399 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-catalog-content\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.625190 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxpn9\" (UniqueName: \"kubernetes.io/projected/4a249dbf-2750-4720-b4f5-3e594b716744-kube-api-access-wxpn9\") pod \"redhat-marketplace-lb7rf\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:04 crc kubenswrapper[4965]: I1125 15:41:04.744763 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:05 crc kubenswrapper[4965]: I1125 15:41:05.199372 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb7rf"] Nov 25 15:41:05 crc kubenswrapper[4965]: W1125 15:41:05.212229 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a249dbf_2750_4720_b4f5_3e594b716744.slice/crio-42cdce40f7b030ffc4a70d2f0258ffbc2996c174850fa9cf02c33ce22ad10173 WatchSource:0}: Error finding container 42cdce40f7b030ffc4a70d2f0258ffbc2996c174850fa9cf02c33ce22ad10173: Status 404 returned error can't find the container with id 42cdce40f7b030ffc4a70d2f0258ffbc2996c174850fa9cf02c33ce22ad10173 Nov 25 15:41:06 crc kubenswrapper[4965]: I1125 15:41:06.139451 4965 generic.go:334] "Generic (PLEG): container finished" podID="4a249dbf-2750-4720-b4f5-3e594b716744" containerID="177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7" exitCode=0 Nov 25 15:41:06 crc kubenswrapper[4965]: I1125 15:41:06.139685 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerDied","Data":"177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7"} Nov 25 15:41:06 crc kubenswrapper[4965]: I1125 15:41:06.139946 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerStarted","Data":"42cdce40f7b030ffc4a70d2f0258ffbc2996c174850fa9cf02c33ce22ad10173"} Nov 25 15:41:06 crc kubenswrapper[4965]: I1125 15:41:06.143715 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:41:07 crc kubenswrapper[4965]: I1125 15:41:07.162424 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerStarted","Data":"6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad"} Nov 25 15:41:08 crc kubenswrapper[4965]: I1125 15:41:08.178343 4965 generic.go:334] "Generic (PLEG): container finished" podID="4a249dbf-2750-4720-b4f5-3e594b716744" containerID="6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad" exitCode=0 Nov 25 15:41:08 crc kubenswrapper[4965]: I1125 15:41:08.178400 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerDied","Data":"6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad"} Nov 25 15:41:09 crc kubenswrapper[4965]: I1125 15:41:09.190705 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerStarted","Data":"0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef"} Nov 25 15:41:09 crc kubenswrapper[4965]: I1125 15:41:09.213634 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lb7rf" podStartSLOduration=2.484701279 podStartE2EDuration="5.213616695s" podCreationTimestamp="2025-11-25 15:41:04 +0000 UTC" firstStartedPulling="2025-11-25 15:41:06.143396909 +0000 UTC m=+2211.110990665" lastFinishedPulling="2025-11-25 15:41:08.872312325 +0000 UTC m=+2213.839906081" observedRunningTime="2025-11-25 15:41:09.207741765 +0000 UTC m=+2214.175335521" watchObservedRunningTime="2025-11-25 15:41:09.213616695 +0000 UTC m=+2214.181210431" Nov 25 15:41:14 crc kubenswrapper[4965]: I1125 15:41:14.745358 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:14 crc kubenswrapper[4965]: I1125 15:41:14.746180 4965 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:14 crc kubenswrapper[4965]: I1125 15:41:14.808795 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:15 crc kubenswrapper[4965]: I1125 15:41:15.303710 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:15 crc kubenswrapper[4965]: I1125 15:41:15.375522 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb7rf"] Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.257510 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lb7rf" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="registry-server" containerID="cri-o://0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef" gracePeriod=2 Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.735464 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.884876 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxpn9\" (UniqueName: \"kubernetes.io/projected/4a249dbf-2750-4720-b4f5-3e594b716744-kube-api-access-wxpn9\") pod \"4a249dbf-2750-4720-b4f5-3e594b716744\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.885030 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-catalog-content\") pod \"4a249dbf-2750-4720-b4f5-3e594b716744\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.885140 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-utilities\") pod \"4a249dbf-2750-4720-b4f5-3e594b716744\" (UID: \"4a249dbf-2750-4720-b4f5-3e594b716744\") " Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.887667 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-utilities" (OuterVolumeSpecName: "utilities") pod "4a249dbf-2750-4720-b4f5-3e594b716744" (UID: "4a249dbf-2750-4720-b4f5-3e594b716744"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.893137 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a249dbf-2750-4720-b4f5-3e594b716744-kube-api-access-wxpn9" (OuterVolumeSpecName: "kube-api-access-wxpn9") pod "4a249dbf-2750-4720-b4f5-3e594b716744" (UID: "4a249dbf-2750-4720-b4f5-3e594b716744"). InnerVolumeSpecName "kube-api-access-wxpn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.915515 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a249dbf-2750-4720-b4f5-3e594b716744" (UID: "4a249dbf-2750-4720-b4f5-3e594b716744"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.987919 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.988389 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a249dbf-2750-4720-b4f5-3e594b716744-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:41:17 crc kubenswrapper[4965]: I1125 15:41:17.988411 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxpn9\" (UniqueName: \"kubernetes.io/projected/4a249dbf-2750-4720-b4f5-3e594b716744-kube-api-access-wxpn9\") on node \"crc\" DevicePath \"\"" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.268797 4965 generic.go:334] "Generic (PLEG): container finished" podID="4a249dbf-2750-4720-b4f5-3e594b716744" containerID="0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef" exitCode=0 Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.268842 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerDied","Data":"0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef"} Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.268854 4965 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb7rf" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.268872 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb7rf" event={"ID":"4a249dbf-2750-4720-b4f5-3e594b716744","Type":"ContainerDied","Data":"42cdce40f7b030ffc4a70d2f0258ffbc2996c174850fa9cf02c33ce22ad10173"} Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.268894 4965 scope.go:117] "RemoveContainer" containerID="0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.286797 4965 scope.go:117] "RemoveContainer" containerID="6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.318034 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb7rf"] Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.321109 4965 scope.go:117] "RemoveContainer" containerID="177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.324365 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb7rf"] Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.360646 4965 scope.go:117] "RemoveContainer" containerID="0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef" Nov 25 15:41:18 crc kubenswrapper[4965]: E1125 15:41:18.365258 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef\": container with ID starting with 0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef not found: ID does not exist" containerID="0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.365304 4965 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef"} err="failed to get container status \"0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef\": rpc error: code = NotFound desc = could not find container \"0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef\": container with ID starting with 0d67eb2d8ebd34cc1081a0200b7cb03aea0f3ef807e3531719e0dfa620c2ebef not found: ID does not exist" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.365332 4965 scope.go:117] "RemoveContainer" containerID="6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad" Nov 25 15:41:18 crc kubenswrapper[4965]: E1125 15:41:18.365720 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad\": container with ID starting with 6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad not found: ID does not exist" containerID="6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.365757 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad"} err="failed to get container status \"6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad\": rpc error: code = NotFound desc = could not find container \"6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad\": container with ID starting with 6cf756ab8547f51b4eb15a2b8b4abf4f9c0ac14a1b2ef3f07b15f2d3e01233ad not found: ID does not exist" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.365785 4965 scope.go:117] "RemoveContainer" containerID="177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7" Nov 25 15:41:18 crc kubenswrapper[4965]: E1125 
15:41:18.366244 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7\": container with ID starting with 177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7 not found: ID does not exist" containerID="177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.366272 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7"} err="failed to get container status \"177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7\": rpc error: code = NotFound desc = could not find container \"177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7\": container with ID starting with 177fe59262e4026b21da0287033eb13354e7f18e19f5351e47ce007720d6d7c7 not found: ID does not exist" Nov 25 15:41:18 crc kubenswrapper[4965]: I1125 15:41:18.782023 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" path="/var/lib/kubelet/pods/4a249dbf-2750-4720-b4f5-3e594b716744/volumes" Nov 25 15:41:23 crc kubenswrapper[4965]: I1125 15:41:23.260683 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:41:23 crc kubenswrapper[4965]: I1125 15:41:23.261365 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.389309 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hzw85"] Nov 25 15:41:32 crc kubenswrapper[4965]: E1125 15:41:32.390205 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="registry-server" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.390219 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="registry-server" Nov 25 15:41:32 crc kubenswrapper[4965]: E1125 15:41:32.390242 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="extract-utilities" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.390248 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="extract-utilities" Nov 25 15:41:32 crc kubenswrapper[4965]: E1125 15:41:32.390269 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="extract-content" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.390277 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="extract-content" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.390471 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a249dbf-2750-4720-b4f5-3e594b716744" containerName="registry-server" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.391697 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.397368 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hzw85"] Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.430676 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-catalog-content\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.431075 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-utilities\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.431108 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zr7x\" (UniqueName: \"kubernetes.io/projected/83fa5af3-ba75-45b0-b4e2-dba873e53b24-kube-api-access-6zr7x\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.532590 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-catalog-content\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.532732 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-utilities\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.532759 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zr7x\" (UniqueName: \"kubernetes.io/projected/83fa5af3-ba75-45b0-b4e2-dba873e53b24-kube-api-access-6zr7x\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.533095 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-catalog-content\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.533174 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-utilities\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.550693 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zr7x\" (UniqueName: \"kubernetes.io/projected/83fa5af3-ba75-45b0-b4e2-dba873e53b24-kube-api-access-6zr7x\") pod \"certified-operators-hzw85\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:32 crc kubenswrapper[4965]: I1125 15:41:32.722451 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:33 crc kubenswrapper[4965]: I1125 15:41:33.310383 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hzw85"] Nov 25 15:41:33 crc kubenswrapper[4965]: I1125 15:41:33.503538 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerStarted","Data":"c6ccb162eeecbb9422597c7c4bee56d08140d92c6ffd530df3d1552321d70483"} Nov 25 15:41:34 crc kubenswrapper[4965]: I1125 15:41:34.513344 4965 generic.go:334] "Generic (PLEG): container finished" podID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerID="e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a" exitCode=0 Nov 25 15:41:34 crc kubenswrapper[4965]: I1125 15:41:34.513393 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerDied","Data":"e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a"} Nov 25 15:41:35 crc kubenswrapper[4965]: I1125 15:41:35.527037 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerStarted","Data":"7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10"} Nov 25 15:41:36 crc kubenswrapper[4965]: I1125 15:41:36.537442 4965 generic.go:334] "Generic (PLEG): container finished" podID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerID="7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10" exitCode=0 Nov 25 15:41:36 crc kubenswrapper[4965]: I1125 15:41:36.537541 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" 
event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerDied","Data":"7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10"} Nov 25 15:41:37 crc kubenswrapper[4965]: I1125 15:41:37.552274 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerStarted","Data":"324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875"} Nov 25 15:41:37 crc kubenswrapper[4965]: I1125 15:41:37.578730 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hzw85" podStartSLOduration=3.13273345 podStartE2EDuration="5.578708961s" podCreationTimestamp="2025-11-25 15:41:32 +0000 UTC" firstStartedPulling="2025-11-25 15:41:34.516471052 +0000 UTC m=+2239.484064808" lastFinishedPulling="2025-11-25 15:41:36.962446573 +0000 UTC m=+2241.930040319" observedRunningTime="2025-11-25 15:41:37.569795508 +0000 UTC m=+2242.537389274" watchObservedRunningTime="2025-11-25 15:41:37.578708961 +0000 UTC m=+2242.546302727" Nov 25 15:41:42 crc kubenswrapper[4965]: I1125 15:41:42.722770 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:42 crc kubenswrapper[4965]: I1125 15:41:42.723464 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:42 crc kubenswrapper[4965]: I1125 15:41:42.782769 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:43 crc kubenswrapper[4965]: I1125 15:41:43.656417 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:43 crc kubenswrapper[4965]: I1125 15:41:43.716926 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-hzw85"] Nov 25 15:41:45 crc kubenswrapper[4965]: I1125 15:41:45.624600 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hzw85" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="registry-server" containerID="cri-o://324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875" gracePeriod=2 Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.107321 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.208388 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-utilities\") pod \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.208449 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-catalog-content\") pod \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.208524 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zr7x\" (UniqueName: \"kubernetes.io/projected/83fa5af3-ba75-45b0-b4e2-dba873e53b24-kube-api-access-6zr7x\") pod \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\" (UID: \"83fa5af3-ba75-45b0-b4e2-dba873e53b24\") " Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.209667 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-utilities" (OuterVolumeSpecName: "utilities") pod "83fa5af3-ba75-45b0-b4e2-dba873e53b24" (UID: 
"83fa5af3-ba75-45b0-b4e2-dba873e53b24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.222352 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83fa5af3-ba75-45b0-b4e2-dba873e53b24-kube-api-access-6zr7x" (OuterVolumeSpecName: "kube-api-access-6zr7x") pod "83fa5af3-ba75-45b0-b4e2-dba873e53b24" (UID: "83fa5af3-ba75-45b0-b4e2-dba873e53b24"). InnerVolumeSpecName "kube-api-access-6zr7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.310452 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.310508 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zr7x\" (UniqueName: \"kubernetes.io/projected/83fa5af3-ba75-45b0-b4e2-dba873e53b24-kube-api-access-6zr7x\") on node \"crc\" DevicePath \"\"" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.518957 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83fa5af3-ba75-45b0-b4e2-dba873e53b24" (UID: "83fa5af3-ba75-45b0-b4e2-dba873e53b24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.615299 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83fa5af3-ba75-45b0-b4e2-dba873e53b24-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.635025 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hzw85" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.635070 4965 generic.go:334] "Generic (PLEG): container finished" podID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerID="324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875" exitCode=0 Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.635124 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerDied","Data":"324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875"} Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.635161 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzw85" event={"ID":"83fa5af3-ba75-45b0-b4e2-dba873e53b24","Type":"ContainerDied","Data":"c6ccb162eeecbb9422597c7c4bee56d08140d92c6ffd530df3d1552321d70483"} Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.635184 4965 scope.go:117] "RemoveContainer" containerID="324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.656737 4965 scope.go:117] "RemoveContainer" containerID="7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.675465 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hzw85"] Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.684749 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hzw85"] Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.699074 4965 scope.go:117] "RemoveContainer" containerID="e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.739623 4965 scope.go:117] "RemoveContainer" 
containerID="324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875" Nov 25 15:41:46 crc kubenswrapper[4965]: E1125 15:41:46.740211 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875\": container with ID starting with 324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875 not found: ID does not exist" containerID="324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.740246 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875"} err="failed to get container status \"324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875\": rpc error: code = NotFound desc = could not find container \"324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875\": container with ID starting with 324e249ac6b725fd7f0b3ecbfdf79349763562230ec51e016299aeb074c52875 not found: ID does not exist" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.740272 4965 scope.go:117] "RemoveContainer" containerID="7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10" Nov 25 15:41:46 crc kubenswrapper[4965]: E1125 15:41:46.741234 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10\": container with ID starting with 7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10 not found: ID does not exist" containerID="7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.741380 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10"} err="failed to get container status \"7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10\": rpc error: code = NotFound desc = could not find container \"7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10\": container with ID starting with 7fd55aef8f1b7efba7e14b98a7f278a9a19de942b0226ba6ca4e1850deebcb10 not found: ID does not exist" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.741499 4965 scope.go:117] "RemoveContainer" containerID="e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a" Nov 25 15:41:46 crc kubenswrapper[4965]: E1125 15:41:46.742045 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a\": container with ID starting with e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a not found: ID does not exist" containerID="e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.742080 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a"} err="failed to get container status \"e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a\": rpc error: code = NotFound desc = could not find container \"e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a\": container with ID starting with e63ec41db59fd1f565571b9ada6335dab28d15d154eee7d5f4641de8fabeb30a not found: ID does not exist" Nov 25 15:41:46 crc kubenswrapper[4965]: I1125 15:41:46.781089 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" path="/var/lib/kubelet/pods/83fa5af3-ba75-45b0-b4e2-dba873e53b24/volumes" Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 
15:41:53.260398 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.260998 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.261054 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.261836 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.261897 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" gracePeriod=600 Nov 25 15:41:53 crc kubenswrapper[4965]: E1125 15:41:53.407184 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.715866 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" exitCode=0 Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.715944 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24"} Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.716025 4965 scope.go:117] "RemoveContainer" containerID="a78849570ebdec0ecb7f8223886210e662c725014f30c20e18e315182e077c80" Nov 25 15:41:53 crc kubenswrapper[4965]: I1125 15:41:53.716647 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:41:53 crc kubenswrapper[4965]: E1125 15:41:53.716882 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.638943 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bqbn5"] Nov 25 15:42:02 crc kubenswrapper[4965]: E1125 15:42:02.640099 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="registry-server" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.640116 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="registry-server" Nov 25 15:42:02 crc kubenswrapper[4965]: E1125 15:42:02.640135 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="extract-content" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.640141 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="extract-content" Nov 25 15:42:02 crc kubenswrapper[4965]: E1125 15:42:02.640166 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="extract-utilities" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.640173 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="extract-utilities" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.640397 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fa5af3-ba75-45b0-b4e2-dba873e53b24" containerName="registry-server" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.641914 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.653506 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqbn5"] Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.738366 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-utilities\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.738471 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzb8\" (UniqueName: \"kubernetes.io/projected/f4f12961-0b9d-4525-a8e4-98d6f3700db1-kube-api-access-cmzb8\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.738523 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-catalog-content\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.840455 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-utilities\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.840532 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cmzb8\" (UniqueName: \"kubernetes.io/projected/f4f12961-0b9d-4525-a8e4-98d6f3700db1-kube-api-access-cmzb8\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.840580 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-catalog-content\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.841098 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-utilities\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.841172 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-catalog-content\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.867498 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzb8\" (UniqueName: \"kubernetes.io/projected/f4f12961-0b9d-4525-a8e4-98d6f3700db1-kube-api-access-cmzb8\") pod \"community-operators-bqbn5\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:02 crc kubenswrapper[4965]: I1125 15:42:02.973821 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:03 crc kubenswrapper[4965]: I1125 15:42:03.574420 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqbn5"] Nov 25 15:42:03 crc kubenswrapper[4965]: I1125 15:42:03.805380 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerStarted","Data":"d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa"} Nov 25 15:42:03 crc kubenswrapper[4965]: I1125 15:42:03.805758 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerStarted","Data":"11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997"} Nov 25 15:42:04 crc kubenswrapper[4965]: I1125 15:42:04.815581 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerID="d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa" exitCode=0 Nov 25 15:42:04 crc kubenswrapper[4965]: I1125 15:42:04.815622 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerDied","Data":"d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa"} Nov 25 15:42:05 crc kubenswrapper[4965]: I1125 15:42:05.826930 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerStarted","Data":"153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34"} Nov 25 15:42:06 crc kubenswrapper[4965]: I1125 15:42:06.778625 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:42:06 crc 
kubenswrapper[4965]: E1125 15:42:06.779209 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:42:06 crc kubenswrapper[4965]: I1125 15:42:06.839375 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerDied","Data":"153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34"} Nov 25 15:42:06 crc kubenswrapper[4965]: I1125 15:42:06.839502 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerID="153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34" exitCode=0 Nov 25 15:42:07 crc kubenswrapper[4965]: I1125 15:42:07.868192 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerStarted","Data":"a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1"} Nov 25 15:42:07 crc kubenswrapper[4965]: I1125 15:42:07.894451 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bqbn5" podStartSLOduration=3.206029193 podStartE2EDuration="5.894435602s" podCreationTimestamp="2025-11-25 15:42:02 +0000 UTC" firstStartedPulling="2025-11-25 15:42:04.817214004 +0000 UTC m=+2269.784807750" lastFinishedPulling="2025-11-25 15:42:07.505620413 +0000 UTC m=+2272.473214159" observedRunningTime="2025-11-25 15:42:07.888644503 +0000 UTC m=+2272.856238249" watchObservedRunningTime="2025-11-25 15:42:07.894435602 +0000 UTC 
m=+2272.862029338" Nov 25 15:42:12 crc kubenswrapper[4965]: I1125 15:42:12.975154 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:12 crc kubenswrapper[4965]: I1125 15:42:12.977206 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:13 crc kubenswrapper[4965]: I1125 15:42:13.052790 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:13 crc kubenswrapper[4965]: I1125 15:42:13.954069 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:13 crc kubenswrapper[4965]: I1125 15:42:13.998344 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bqbn5"] Nov 25 15:42:15 crc kubenswrapper[4965]: I1125 15:42:15.934031 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bqbn5" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="registry-server" containerID="cri-o://a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1" gracePeriod=2 Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.448443 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.608540 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzb8\" (UniqueName: \"kubernetes.io/projected/f4f12961-0b9d-4525-a8e4-98d6f3700db1-kube-api-access-cmzb8\") pod \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.609173 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-catalog-content\") pod \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.609325 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-utilities\") pod \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\" (UID: \"f4f12961-0b9d-4525-a8e4-98d6f3700db1\") " Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.610711 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-utilities" (OuterVolumeSpecName: "utilities") pod "f4f12961-0b9d-4525-a8e4-98d6f3700db1" (UID: "f4f12961-0b9d-4525-a8e4-98d6f3700db1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.617507 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f12961-0b9d-4525-a8e4-98d6f3700db1-kube-api-access-cmzb8" (OuterVolumeSpecName: "kube-api-access-cmzb8") pod "f4f12961-0b9d-4525-a8e4-98d6f3700db1" (UID: "f4f12961-0b9d-4525-a8e4-98d6f3700db1"). InnerVolumeSpecName "kube-api-access-cmzb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.692593 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f12961-0b9d-4525-a8e4-98d6f3700db1" (UID: "f4f12961-0b9d-4525-a8e4-98d6f3700db1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.711906 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.711945 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzb8\" (UniqueName: \"kubernetes.io/projected/f4f12961-0b9d-4525-a8e4-98d6f3700db1-kube-api-access-cmzb8\") on node \"crc\" DevicePath \"\"" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.711960 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f12961-0b9d-4525-a8e4-98d6f3700db1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.948889 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerID="a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1" exitCode=0 Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.949011 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerDied","Data":"a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1"} Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.949069 4965 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-bqbn5" event={"ID":"f4f12961-0b9d-4525-a8e4-98d6f3700db1","Type":"ContainerDied","Data":"11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997"} Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.949078 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bqbn5" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.949088 4965 scope.go:117] "RemoveContainer" containerID="a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1" Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.986543 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bqbn5"] Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.995957 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bqbn5"] Nov 25 15:42:16 crc kubenswrapper[4965]: I1125 15:42:16.996263 4965 scope.go:117] "RemoveContainer" containerID="153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 15:42:17.019941 4965 scope.go:117] "RemoveContainer" containerID="d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 15:42:17.065656 4965 scope.go:117] "RemoveContainer" containerID="a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1" Nov 25 15:42:17 crc kubenswrapper[4965]: E1125 15:42:17.066028 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1\": container with ID starting with a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1 not found: ID does not exist" containerID="a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 
15:42:17.066055 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1"} err="failed to get container status \"a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1\": rpc error: code = NotFound desc = could not find container \"a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1\": container with ID starting with a00af2c8f9b8fb63c63446deb737d878eb8e15583486b993c24962cdaaac66d1 not found: ID does not exist" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 15:42:17.066088 4965 scope.go:117] "RemoveContainer" containerID="153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34" Nov 25 15:42:17 crc kubenswrapper[4965]: E1125 15:42:17.066349 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34\": container with ID starting with 153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34 not found: ID does not exist" containerID="153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 15:42:17.066383 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34"} err="failed to get container status \"153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34\": rpc error: code = NotFound desc = could not find container \"153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34\": container with ID starting with 153998830e0d801588d958827a576b95c24af3f0784475cbca8a465a5098ba34 not found: ID does not exist" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 15:42:17.066409 4965 scope.go:117] "RemoveContainer" containerID="d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa" Nov 25 15:42:17 crc 
kubenswrapper[4965]: E1125 15:42:17.066911 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa\": container with ID starting with d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa not found: ID does not exist" containerID="d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa" Nov 25 15:42:17 crc kubenswrapper[4965]: I1125 15:42:17.066955 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa"} err="failed to get container status \"d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa\": rpc error: code = NotFound desc = could not find container \"d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa\": container with ID starting with d418e4078b0b91b4ee7a019e7d5dc7d50a81425de92f3b575eac3cc9a8c11dfa not found: ID does not exist" Nov 25 15:42:18 crc kubenswrapper[4965]: I1125 15:42:18.783257 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" path="/var/lib/kubelet/pods/f4f12961-0b9d-4525-a8e4-98d6f3700db1/volumes" Nov 25 15:42:21 crc kubenswrapper[4965]: I1125 15:42:21.772177 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:42:21 crc kubenswrapper[4965]: E1125 15:42:21.772747 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:42:22 crc 
kubenswrapper[4965]: E1125 15:42:22.002563 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice/crio-11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997\": RecentStats: unable to find data in memory cache]" Nov 25 15:42:32 crc kubenswrapper[4965]: E1125 15:42:32.240195 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice/crio-11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997\": RecentStats: unable to find data in memory cache]" Nov 25 15:42:36 crc kubenswrapper[4965]: I1125 15:42:36.776986 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:42:36 crc kubenswrapper[4965]: E1125 15:42:36.777677 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:42:42 crc kubenswrapper[4965]: E1125 15:42:42.452143 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice/crio-11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997\": RecentStats: unable to find data in memory cache]" Nov 25 15:42:47 crc kubenswrapper[4965]: I1125 15:42:47.771789 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:42:47 crc kubenswrapper[4965]: E1125 15:42:47.772545 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:42:52 crc kubenswrapper[4965]: E1125 15:42:52.686555 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice/crio-11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997\": RecentStats: unable to find data in memory cache]" Nov 25 15:43:02 crc kubenswrapper[4965]: I1125 15:43:02.771254 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:43:02 crc kubenswrapper[4965]: E1125 15:43:02.772091 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:43:02 crc kubenswrapper[4965]: E1125 15:43:02.902069 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice/crio-11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997\": RecentStats: unable to find data in memory cache]" Nov 25 15:43:13 crc kubenswrapper[4965]: E1125 15:43:13.136628 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice/crio-11eb2e1dca78bdd65ab7e6ff33a19a87d8a98d2c6f5978b980d9f8bd11aed997\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f12961_0b9d_4525_a8e4_98d6f3700db1.slice\": RecentStats: unable to find data in memory cache]" Nov 25 15:43:16 crc kubenswrapper[4965]: I1125 15:43:16.779085 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:43:16 crc kubenswrapper[4965]: E1125 15:43:16.781519 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:43:31 crc kubenswrapper[4965]: I1125 15:43:31.772094 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:43:31 crc kubenswrapper[4965]: E1125 15:43:31.772736 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:43:42 crc kubenswrapper[4965]: I1125 15:43:42.774217 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:43:42 crc kubenswrapper[4965]: E1125 15:43:42.775050 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:43:54 crc kubenswrapper[4965]: I1125 15:43:54.771449 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:43:54 crc kubenswrapper[4965]: E1125 15:43:54.772274 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:44:05 crc kubenswrapper[4965]: I1125 15:44:05.771515 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:44:05 crc kubenswrapper[4965]: E1125 15:44:05.772401 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:44:18 crc kubenswrapper[4965]: I1125 15:44:18.771627 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:44:18 crc kubenswrapper[4965]: E1125 15:44:18.772447 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:44:29 crc kubenswrapper[4965]: I1125 15:44:29.771936 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:44:29 crc kubenswrapper[4965]: E1125 15:44:29.773257 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:44:43 crc kubenswrapper[4965]: I1125 15:44:43.772152 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:44:43 crc kubenswrapper[4965]: E1125 15:44:43.773256 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:44:56 crc kubenswrapper[4965]: I1125 15:44:56.772185 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:44:56 crc kubenswrapper[4965]: E1125 15:44:56.773388 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:44:57 crc kubenswrapper[4965]: E1125 15:44:57.476205 4965 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:36428->38.102.83.176:37963: write tcp 38.102.83.176:36428->38.102.83.176:37963: write: broken pipe Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.165133 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2"] Nov 25 15:45:00 crc kubenswrapper[4965]: E1125 15:45:00.166129 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="registry-server" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.166145 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="registry-server" Nov 25 15:45:00 crc kubenswrapper[4965]: E1125 15:45:00.166160 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="extract-content" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.166166 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="extract-content" Nov 25 15:45:00 crc kubenswrapper[4965]: E1125 15:45:00.166205 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="extract-utilities" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.166212 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="extract-utilities" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.166394 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f12961-0b9d-4525-a8e4-98d6f3700db1" containerName="registry-server" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.167160 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.169717 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.170040 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.186308 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2"] Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.368897 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89510618-03cf-4e1d-b8e3-ce053ca16669-config-volume\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.369185 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89510618-03cf-4e1d-b8e3-ce053ca16669-secret-volume\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.369318 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc44l\" (UniqueName: \"kubernetes.io/projected/89510618-03cf-4e1d-b8e3-ce053ca16669-kube-api-access-mc44l\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.470642 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89510618-03cf-4e1d-b8e3-ce053ca16669-config-volume\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.470681 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89510618-03cf-4e1d-b8e3-ce053ca16669-secret-volume\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.470726 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc44l\" (UniqueName: \"kubernetes.io/projected/89510618-03cf-4e1d-b8e3-ce053ca16669-kube-api-access-mc44l\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.471750 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89510618-03cf-4e1d-b8e3-ce053ca16669-config-volume\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.479665 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/89510618-03cf-4e1d-b8e3-ce053ca16669-secret-volume\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.489721 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc44l\" (UniqueName: \"kubernetes.io/projected/89510618-03cf-4e1d-b8e3-ce053ca16669-kube-api-access-mc44l\") pod \"collect-profiles-29401425-l7vb2\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:00 crc kubenswrapper[4965]: I1125 15:45:00.787599 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:01 crc kubenswrapper[4965]: I1125 15:45:01.254179 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2"] Nov 25 15:45:01 crc kubenswrapper[4965]: I1125 15:45:01.477226 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" event={"ID":"89510618-03cf-4e1d-b8e3-ce053ca16669","Type":"ContainerStarted","Data":"7ae22465d9c4f0914323d31b343d047202762b0b8157d338956188b093d5ff88"} Nov 25 15:45:01 crc kubenswrapper[4965]: I1125 15:45:01.477606 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" event={"ID":"89510618-03cf-4e1d-b8e3-ce053ca16669","Type":"ContainerStarted","Data":"c2f43f865156b8a02c83218cfc4510132f16f2bf871eb062b99fc27e9b10efda"} Nov 25 15:45:01 crc kubenswrapper[4965]: I1125 15:45:01.494660 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" 
podStartSLOduration=1.49464156 podStartE2EDuration="1.49464156s" podCreationTimestamp="2025-11-25 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:45:01.492904752 +0000 UTC m=+2446.460498508" watchObservedRunningTime="2025-11-25 15:45:01.49464156 +0000 UTC m=+2446.462235306" Nov 25 15:45:02 crc kubenswrapper[4965]: I1125 15:45:02.487990 4965 generic.go:334] "Generic (PLEG): container finished" podID="89510618-03cf-4e1d-b8e3-ce053ca16669" containerID="7ae22465d9c4f0914323d31b343d047202762b0b8157d338956188b093d5ff88" exitCode=0 Nov 25 15:45:02 crc kubenswrapper[4965]: I1125 15:45:02.488040 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" event={"ID":"89510618-03cf-4e1d-b8e3-ce053ca16669","Type":"ContainerDied","Data":"7ae22465d9c4f0914323d31b343d047202762b0b8157d338956188b093d5ff88"} Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.816732 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.839191 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89510618-03cf-4e1d-b8e3-ce053ca16669-secret-volume\") pod \"89510618-03cf-4e1d-b8e3-ce053ca16669\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.839269 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc44l\" (UniqueName: \"kubernetes.io/projected/89510618-03cf-4e1d-b8e3-ce053ca16669-kube-api-access-mc44l\") pod \"89510618-03cf-4e1d-b8e3-ce053ca16669\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.839358 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89510618-03cf-4e1d-b8e3-ce053ca16669-config-volume\") pod \"89510618-03cf-4e1d-b8e3-ce053ca16669\" (UID: \"89510618-03cf-4e1d-b8e3-ce053ca16669\") " Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.840851 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89510618-03cf-4e1d-b8e3-ce053ca16669-config-volume" (OuterVolumeSpecName: "config-volume") pod "89510618-03cf-4e1d-b8e3-ce053ca16669" (UID: "89510618-03cf-4e1d-b8e3-ce053ca16669"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.842942 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89510618-03cf-4e1d-b8e3-ce053ca16669-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.846381 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89510618-03cf-4e1d-b8e3-ce053ca16669-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "89510618-03cf-4e1d-b8e3-ce053ca16669" (UID: "89510618-03cf-4e1d-b8e3-ce053ca16669"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.861199 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89510618-03cf-4e1d-b8e3-ce053ca16669-kube-api-access-mc44l" (OuterVolumeSpecName: "kube-api-access-mc44l") pod "89510618-03cf-4e1d-b8e3-ce053ca16669" (UID: "89510618-03cf-4e1d-b8e3-ce053ca16669"). InnerVolumeSpecName "kube-api-access-mc44l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.944459 4965 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89510618-03cf-4e1d-b8e3-ce053ca16669-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:45:03 crc kubenswrapper[4965]: I1125 15:45:03.944506 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc44l\" (UniqueName: \"kubernetes.io/projected/89510618-03cf-4e1d-b8e3-ce053ca16669-kube-api-access-mc44l\") on node \"crc\" DevicePath \"\"" Nov 25 15:45:04 crc kubenswrapper[4965]: I1125 15:45:04.328257 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs"] Nov 25 15:45:04 crc kubenswrapper[4965]: I1125 15:45:04.336051 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401380-dlnxs"] Nov 25 15:45:04 crc kubenswrapper[4965]: I1125 15:45:04.502994 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" event={"ID":"89510618-03cf-4e1d-b8e3-ce053ca16669","Type":"ContainerDied","Data":"c2f43f865156b8a02c83218cfc4510132f16f2bf871eb062b99fc27e9b10efda"} Nov 25 15:45:04 crc kubenswrapper[4965]: I1125 15:45:04.503031 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-l7vb2" Nov 25 15:45:04 crc kubenswrapper[4965]: I1125 15:45:04.503038 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f43f865156b8a02c83218cfc4510132f16f2bf871eb062b99fc27e9b10efda" Nov 25 15:45:04 crc kubenswrapper[4965]: I1125 15:45:04.782568 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a" path="/var/lib/kubelet/pods/3e2f6eb9-9c21-4f2f-8247-087a6aa40f3a/volumes" Nov 25 15:45:09 crc kubenswrapper[4965]: I1125 15:45:09.772684 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:45:09 crc kubenswrapper[4965]: E1125 15:45:09.775030 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:45:24 crc kubenswrapper[4965]: I1125 15:45:24.771682 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:45:24 crc kubenswrapper[4965]: E1125 15:45:24.773337 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:45:35 crc kubenswrapper[4965]: I1125 15:45:35.533834 4965 scope.go:117] 
"RemoveContainer" containerID="e20beb0854d305b63b3a390e6186a97cb6bcf36b71530a02d770432cf0f26b8c" Nov 25 15:45:39 crc kubenswrapper[4965]: I1125 15:45:39.771500 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:45:39 crc kubenswrapper[4965]: E1125 15:45:39.772496 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:45:52 crc kubenswrapper[4965]: I1125 15:45:52.772013 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:45:52 crc kubenswrapper[4965]: E1125 15:45:52.772727 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:46:07 crc kubenswrapper[4965]: I1125 15:46:07.772377 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:46:07 crc kubenswrapper[4965]: E1125 15:46:07.775538 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:46:18 crc kubenswrapper[4965]: I1125 15:46:18.771480 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:46:18 crc kubenswrapper[4965]: E1125 15:46:18.772281 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:46:23 crc kubenswrapper[4965]: I1125 15:46:23.993941 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5wld/must-gather-rkk4m"] Nov 25 15:46:23 crc kubenswrapper[4965]: E1125 15:46:23.994893 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89510618-03cf-4e1d-b8e3-ce053ca16669" containerName="collect-profiles" Nov 25 15:46:23 crc kubenswrapper[4965]: I1125 15:46:23.994908 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="89510618-03cf-4e1d-b8e3-ce053ca16669" containerName="collect-profiles" Nov 25 15:46:23 crc kubenswrapper[4965]: I1125 15:46:23.995120 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="89510618-03cf-4e1d-b8e3-ce053ca16669" containerName="collect-profiles" Nov 25 15:46:23 crc kubenswrapper[4965]: I1125 15:46:23.996132 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.000208 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f5wld"/"openshift-service-ca.crt" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.007340 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f5wld"/"kube-root-ca.crt" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.102866 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-must-gather-output\") pod \"must-gather-rkk4m\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.102936 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98l8s\" (UniqueName: \"kubernetes.io/projected/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-kube-api-access-98l8s\") pod \"must-gather-rkk4m\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.113836 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5wld/must-gather-rkk4m"] Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.205037 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-must-gather-output\") pod \"must-gather-rkk4m\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.205406 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-98l8s\" (UniqueName: \"kubernetes.io/projected/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-kube-api-access-98l8s\") pod \"must-gather-rkk4m\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.205784 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-must-gather-output\") pod \"must-gather-rkk4m\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.221923 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98l8s\" (UniqueName: \"kubernetes.io/projected/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-kube-api-access-98l8s\") pod \"must-gather-rkk4m\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.331512 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.631741 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5wld/must-gather-rkk4m"] Nov 25 15:46:24 crc kubenswrapper[4965]: I1125 15:46:24.642552 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:46:25 crc kubenswrapper[4965]: I1125 15:46:25.230767 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/must-gather-rkk4m" event={"ID":"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982","Type":"ContainerStarted","Data":"27885e6510d694c286f36e49bdf51f33dc0bde939555795eece4640dccbc6562"} Nov 25 15:46:29 crc kubenswrapper[4965]: I1125 15:46:29.268453 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/must-gather-rkk4m" event={"ID":"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982","Type":"ContainerStarted","Data":"4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989"} Nov 25 15:46:30 crc kubenswrapper[4965]: I1125 15:46:30.275234 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/must-gather-rkk4m" event={"ID":"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982","Type":"ContainerStarted","Data":"69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d"} Nov 25 15:46:30 crc kubenswrapper[4965]: I1125 15:46:30.297380 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5wld/must-gather-rkk4m" podStartSLOduration=3.1608309390000002 podStartE2EDuration="7.297363082s" podCreationTimestamp="2025-11-25 15:46:23 +0000 UTC" firstStartedPulling="2025-11-25 15:46:24.642523624 +0000 UTC m=+2529.610117370" lastFinishedPulling="2025-11-25 15:46:28.779055767 +0000 UTC m=+2533.746649513" observedRunningTime="2025-11-25 15:46:30.2917909 +0000 UTC m=+2535.259384646" watchObservedRunningTime="2025-11-25 15:46:30.297363082 +0000 UTC 
m=+2535.264956828" Nov 25 15:46:32 crc kubenswrapper[4965]: E1125 15:46:32.232323 4965 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:48346->38.102.83.176:37963: write tcp 38.102.83.176:48346->38.102.83.176:37963: write: connection reset by peer Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.730095 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5wld/crc-debug-hqw62"] Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.731448 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.734520 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f5wld"/"default-dockercfg-vkxwx" Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.879244 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ggj\" (UniqueName: \"kubernetes.io/projected/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-kube-api-access-d5ggj\") pod \"crc-debug-hqw62\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.880525 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-host\") pod \"crc-debug-hqw62\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.981804 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ggj\" (UniqueName: \"kubernetes.io/projected/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-kube-api-access-d5ggj\") pod \"crc-debug-hqw62\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " 
pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.981915 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-host\") pod \"crc-debug-hqw62\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:32 crc kubenswrapper[4965]: I1125 15:46:32.982167 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-host\") pod \"crc-debug-hqw62\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:33 crc kubenswrapper[4965]: I1125 15:46:33.010630 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ggj\" (UniqueName: \"kubernetes.io/projected/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-kube-api-access-d5ggj\") pod \"crc-debug-hqw62\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:33 crc kubenswrapper[4965]: I1125 15:46:33.058542 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:46:33 crc kubenswrapper[4965]: W1125 15:46:33.089570 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3b57c2_f0a3_4137_8ca0_5c92cdf895ea.slice/crio-2ff714e727e573b9f213b986813fe8781044b22cc6302704ce8fd9f3fa34d0a6 WatchSource:0}: Error finding container 2ff714e727e573b9f213b986813fe8781044b22cc6302704ce8fd9f3fa34d0a6: Status 404 returned error can't find the container with id 2ff714e727e573b9f213b986813fe8781044b22cc6302704ce8fd9f3fa34d0a6 Nov 25 15:46:33 crc kubenswrapper[4965]: I1125 15:46:33.297224 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/crc-debug-hqw62" event={"ID":"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea","Type":"ContainerStarted","Data":"2ff714e727e573b9f213b986813fe8781044b22cc6302704ce8fd9f3fa34d0a6"} Nov 25 15:46:33 crc kubenswrapper[4965]: I1125 15:46:33.772712 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:46:33 crc kubenswrapper[4965]: E1125 15:46:33.773489 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:46:44 crc kubenswrapper[4965]: I1125 15:46:44.393184 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/crc-debug-hqw62" event={"ID":"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea","Type":"ContainerStarted","Data":"7322451c391900e4e01cef874957ca23d58761d0936c424754cc70706bd8f3ed"} Nov 25 15:46:44 crc kubenswrapper[4965]: I1125 15:46:44.423224 4965 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5wld/crc-debug-hqw62" podStartSLOduration=1.85773842 podStartE2EDuration="12.423197488s" podCreationTimestamp="2025-11-25 15:46:32 +0000 UTC" firstStartedPulling="2025-11-25 15:46:33.091408281 +0000 UTC m=+2538.059002027" lastFinishedPulling="2025-11-25 15:46:43.656867349 +0000 UTC m=+2548.624461095" observedRunningTime="2025-11-25 15:46:44.40750715 +0000 UTC m=+2549.375100896" watchObservedRunningTime="2025-11-25 15:46:44.423197488 +0000 UTC m=+2549.390791274" Nov 25 15:46:46 crc kubenswrapper[4965]: I1125 15:46:46.784834 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:46:46 crc kubenswrapper[4965]: E1125 15:46:46.785480 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:46:57 crc kubenswrapper[4965]: I1125 15:46:57.772156 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24" Nov 25 15:46:58 crc kubenswrapper[4965]: I1125 15:46:58.537038 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551"} Nov 25 15:47:05 crc kubenswrapper[4965]: I1125 15:47:05.590529 4965 generic.go:334] "Generic (PLEG): container finished" podID="7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" containerID="7322451c391900e4e01cef874957ca23d58761d0936c424754cc70706bd8f3ed" 
exitCode=0 Nov 25 15:47:05 crc kubenswrapper[4965]: I1125 15:47:05.591071 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/crc-debug-hqw62" event={"ID":"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea","Type":"ContainerDied","Data":"7322451c391900e4e01cef874957ca23d58761d0936c424754cc70706bd8f3ed"} Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.724244 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.769386 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5wld/crc-debug-hqw62"] Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.784726 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5wld/crc-debug-hqw62"] Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.845451 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ggj\" (UniqueName: \"kubernetes.io/projected/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-kube-api-access-d5ggj\") pod \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.845841 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-host\") pod \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\" (UID: \"7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea\") " Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.846029 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-host" (OuterVolumeSpecName: "host") pod "7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" (UID: "7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.846616 4965 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-host\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.851145 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-kube-api-access-d5ggj" (OuterVolumeSpecName: "kube-api-access-d5ggj") pod "7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" (UID: "7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea"). InnerVolumeSpecName "kube-api-access-d5ggj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:06 crc kubenswrapper[4965]: I1125 15:47:06.948717 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ggj\" (UniqueName: \"kubernetes.io/projected/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea-kube-api-access-d5ggj\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:07 crc kubenswrapper[4965]: I1125 15:47:07.608663 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ff714e727e573b9f213b986813fe8781044b22cc6302704ce8fd9f3fa34d0a6" Nov 25 15:47:07 crc kubenswrapper[4965]: I1125 15:47:07.608686 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-hqw62" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.013645 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5wld/crc-debug-p96cx"] Nov 25 15:47:08 crc kubenswrapper[4965]: E1125 15:47:08.014057 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" containerName="container-00" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.014070 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" containerName="container-00" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.014274 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" containerName="container-00" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.014829 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.020178 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f5wld"/"default-dockercfg-vkxwx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.170679 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fh4v\" (UniqueName: \"kubernetes.io/projected/3f7ba254-50df-4aa2-8ca8-578456aa87bb-kube-api-access-2fh4v\") pod \"crc-debug-p96cx\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.171048 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7ba254-50df-4aa2-8ca8-578456aa87bb-host\") pod \"crc-debug-p96cx\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " 
pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.272260 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7ba254-50df-4aa2-8ca8-578456aa87bb-host\") pod \"crc-debug-p96cx\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.272339 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7ba254-50df-4aa2-8ca8-578456aa87bb-host\") pod \"crc-debug-p96cx\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.272414 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fh4v\" (UniqueName: \"kubernetes.io/projected/3f7ba254-50df-4aa2-8ca8-578456aa87bb-kube-api-access-2fh4v\") pod \"crc-debug-p96cx\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.293045 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fh4v\" (UniqueName: \"kubernetes.io/projected/3f7ba254-50df-4aa2-8ca8-578456aa87bb-kube-api-access-2fh4v\") pod \"crc-debug-p96cx\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.328631 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.618908 4965 generic.go:334] "Generic (PLEG): container finished" podID="3f7ba254-50df-4aa2-8ca8-578456aa87bb" containerID="020cc4e799eb3286bf81a0035438b3c043a75699c7820d7056b3bbe11b4a4843" exitCode=1 Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.619010 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/crc-debug-p96cx" event={"ID":"3f7ba254-50df-4aa2-8ca8-578456aa87bb","Type":"ContainerDied","Data":"020cc4e799eb3286bf81a0035438b3c043a75699c7820d7056b3bbe11b4a4843"} Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.619644 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/crc-debug-p96cx" event={"ID":"3f7ba254-50df-4aa2-8ca8-578456aa87bb","Type":"ContainerStarted","Data":"746c105c539bf9fca4f191e4d8129c398a4978b1ec551ecd828f0c56d04fdb3c"} Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.681683 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5wld/crc-debug-p96cx"] Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.693117 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5wld/crc-debug-p96cx"] Nov 25 15:47:08 crc kubenswrapper[4965]: I1125 15:47:08.789943 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea" path="/var/lib/kubelet/pods/7f3b57c2-f0a3-4137-8ca0-5c92cdf895ea/volumes" Nov 25 15:47:09 crc kubenswrapper[4965]: I1125 15:47:09.748062 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:09 crc kubenswrapper[4965]: I1125 15:47:09.904505 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7ba254-50df-4aa2-8ca8-578456aa87bb-host\") pod \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " Nov 25 15:47:09 crc kubenswrapper[4965]: I1125 15:47:09.904598 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f7ba254-50df-4aa2-8ca8-578456aa87bb-host" (OuterVolumeSpecName: "host") pod "3f7ba254-50df-4aa2-8ca8-578456aa87bb" (UID: "3f7ba254-50df-4aa2-8ca8-578456aa87bb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:47:09 crc kubenswrapper[4965]: I1125 15:47:09.904781 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fh4v\" (UniqueName: \"kubernetes.io/projected/3f7ba254-50df-4aa2-8ca8-578456aa87bb-kube-api-access-2fh4v\") pod \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\" (UID: \"3f7ba254-50df-4aa2-8ca8-578456aa87bb\") " Nov 25 15:47:09 crc kubenswrapper[4965]: I1125 15:47:09.906197 4965 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7ba254-50df-4aa2-8ca8-578456aa87bb-host\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:09 crc kubenswrapper[4965]: I1125 15:47:09.917328 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7ba254-50df-4aa2-8ca8-578456aa87bb-kube-api-access-2fh4v" (OuterVolumeSpecName: "kube-api-access-2fh4v") pod "3f7ba254-50df-4aa2-8ca8-578456aa87bb" (UID: "3f7ba254-50df-4aa2-8ca8-578456aa87bb"). InnerVolumeSpecName "kube-api-access-2fh4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:10 crc kubenswrapper[4965]: I1125 15:47:10.008303 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fh4v\" (UniqueName: \"kubernetes.io/projected/3f7ba254-50df-4aa2-8ca8-578456aa87bb-kube-api-access-2fh4v\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:10 crc kubenswrapper[4965]: I1125 15:47:10.636273 4965 scope.go:117] "RemoveContainer" containerID="020cc4e799eb3286bf81a0035438b3c043a75699c7820d7056b3bbe11b4a4843" Nov 25 15:47:10 crc kubenswrapper[4965]: I1125 15:47:10.636305 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5wld/crc-debug-p96cx" Nov 25 15:47:10 crc kubenswrapper[4965]: I1125 15:47:10.789331 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7ba254-50df-4aa2-8ca8-578456aa87bb" path="/var/lib/kubelet/pods/3f7ba254-50df-4aa2-8ca8-578456aa87bb/volumes" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.445454 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fdd878f78-r9vjx_7ab61fa8-7198-48ff-920a-39246bf7d752/barbican-api/0.log" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.457294 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fdd878f78-r9vjx_7ab61fa8-7198-48ff-920a-39246bf7d752/barbican-api-log/0.log" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.692425 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bf85454d4-vsrlb_6a6c4bed-60b7-41ee-b4b0-412bb3e25989/barbican-keystone-listener/0.log" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.744297 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bf85454d4-vsrlb_6a6c4bed-60b7-41ee-b4b0-412bb3e25989/barbican-keystone-listener-log/0.log" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.826105 4965 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b86cffbd7-spglj_7960ffe1-ecc7-4e83-9255-8f57e8707289/barbican-worker/0.log" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.925578 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b86cffbd7-spglj_7960ffe1-ecc7-4e83-9255-8f57e8707289/barbican-worker-log/0.log" Nov 25 15:47:55 crc kubenswrapper[4965]: I1125 15:47:55.966146 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9zxp5_fbce69a8-d42e-498d-bbb8-7d98e9b1790e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.418689 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_db88222c-47b1-4187-9794-50f067ffdc89/proxy-httpd/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.454890 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_db88222c-47b1-4187-9794-50f067ffdc89/ceilometer-central-agent/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.457881 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_db88222c-47b1-4187-9794-50f067ffdc89/ceilometer-notification-agent/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.531380 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_db88222c-47b1-4187-9794-50f067ffdc89/sg-core/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.654400 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ssd52_756d59ef-4e5c-4a8c-8295-67ecbb70e81e/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.849284 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_7a7b2938-42cd-4dee-b5fe-2c85b4bea92f/cinder-api/0.log" Nov 25 15:47:56 crc kubenswrapper[4965]: I1125 15:47:56.906745 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7a7b2938-42cd-4dee-b5fe-2c85b4bea92f/cinder-api-log/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.064745 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_daa918e9-374b-4302-910d-496d7a0a746c/cinder-scheduler/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.114191 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_daa918e9-374b-4302-910d-496d7a0a746c/probe/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.340543 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5hrvk_ec41d4aa-6d2e-4068-82a9-18fd1489899b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.342019 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xvrfn_4e699efe-dd84-4db4-aa1c-a47a22e55f5f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.622375 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-msf8j_082f410e-8793-4651-be56-a0c486eebdbc/init/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.744489 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-msf8j_082f410e-8793-4651-be56-a0c486eebdbc/init/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.874421 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-trs8x_71388e9b-27dd-4af7-aa67-4330d051a98d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:47:57 crc kubenswrapper[4965]: I1125 15:47:57.902807 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-msf8j_082f410e-8793-4651-be56-a0c486eebdbc/dnsmasq-dns/0.log" Nov 25 15:47:58 crc kubenswrapper[4965]: I1125 15:47:58.077323 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b6574b966-rvvgn_137be586-5f9e-4e81-a676-b6c30c501608/keystone-api/0.log" Nov 25 15:47:58 crc kubenswrapper[4965]: I1125 15:47:58.163022 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_54bc32ba-400a-4cc6-a7a0-c03eb66edd9d/kube-state-metrics/0.log" Nov 25 15:47:58 crc kubenswrapper[4965]: I1125 15:47:58.512275 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754f5b77b5-wngzm_b65d1464-decb-4a38-8d9c-863605da10e1/neutron-api/0.log" Nov 25 15:47:58 crc kubenswrapper[4965]: I1125 15:47:58.645517 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754f5b77b5-wngzm_b65d1464-decb-4a38-8d9c-863605da10e1/neutron-httpd/0.log" Nov 25 15:47:59 crc kubenswrapper[4965]: I1125 15:47:59.137517 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3e161c28-2cef-473a-bc7f-88e13dbb55c3/nova-api-log/0.log" Nov 25 15:47:59 crc kubenswrapper[4965]: I1125 15:47:59.192308 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3e161c28-2cef-473a-bc7f-88e13dbb55c3/nova-api-api/0.log" Nov 25 15:47:59 crc kubenswrapper[4965]: I1125 15:47:59.673607 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bcd74f74-0867-4dd1-a28e-18cd43f2b3f6/nova-cell0-conductor-conductor/0.log" Nov 25 15:47:59 crc kubenswrapper[4965]: I1125 15:47:59.922420 4965 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_dfb126d7-e0c4-4273-9ede-3a1425a6e36c/nova-cell1-conductor-conductor/0.log" Nov 25 15:48:00 crc kubenswrapper[4965]: I1125 15:48:00.024011 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_90d758e5-3565-44c4-9243-6f331e5eabf0/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 15:48:00 crc kubenswrapper[4965]: I1125 15:48:00.325511 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_009897a7-5f6c-44a3-8076-262d3f946ae9/nova-metadata-log/0.log" Nov 25 15:48:00 crc kubenswrapper[4965]: I1125 15:48:00.699646 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11c417a7-1f7b-42c4-ba2d-e221bdf95f9f/mysql-bootstrap/0.log" Nov 25 15:48:00 crc kubenswrapper[4965]: I1125 15:48:00.835023 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bfe5968d-c0b6-4e22-802d-4d36f89347db/nova-scheduler-scheduler/0.log" Nov 25 15:48:00 crc kubenswrapper[4965]: I1125 15:48:00.908874 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_009897a7-5f6c-44a3-8076-262d3f946ae9/nova-metadata-metadata/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.004202 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11c417a7-1f7b-42c4-ba2d-e221bdf95f9f/mysql-bootstrap/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.012909 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11c417a7-1f7b-42c4-ba2d-e221bdf95f9f/galera/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.241220 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_362085ca-1948-4f56-8add-3e727c63e58e/mysql-bootstrap/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.510840 4965 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_362085ca-1948-4f56-8add-3e727c63e58e/mysql-bootstrap/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.519492 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_edff109d-255f-4c31-a010-896ca2068559/openstackclient/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.548885 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_362085ca-1948-4f56-8add-3e727c63e58e/galera/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.768004 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xk9ml_10034cec-97f3-4270-a5ab-e6b589e6ac13/openstack-network-exporter/0.log" Nov 25 15:48:01 crc kubenswrapper[4965]: I1125 15:48:01.907152 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwsx2_63b33f63-d2e7-48cd-92e3-f47404184ba9/ovsdb-server-init/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.160355 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwsx2_63b33f63-d2e7-48cd-92e3-f47404184ba9/ovsdb-server/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.253070 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwsx2_63b33f63-d2e7-48cd-92e3-f47404184ba9/ovs-vswitchd/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.281429 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwsx2_63b33f63-d2e7-48cd-92e3-f47404184ba9/ovsdb-server-init/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.524731 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4e3054ed-5cc4-4dce-9b59-72ff19700b27/openstack-network-exporter/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.549327 4965 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wscwk_b0dd84e2-a8c8-4cc3-a5a5-deca4210a8ed/ovn-controller/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.566714 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4e3054ed-5cc4-4dce-9b59-72ff19700b27/ovn-northd/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.823370 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_929774db-0294-4631-b00e-1b664c1d4cba/openstack-network-exporter/0.log" Nov 25 15:48:02 crc kubenswrapper[4965]: I1125 15:48:02.852480 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_929774db-0294-4631-b00e-1b664c1d4cba/ovsdbserver-nb/0.log" Nov 25 15:48:03 crc kubenswrapper[4965]: I1125 15:48:03.343091 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8ea3b8e7-7b5d-46e0-b07d-33db65d5305d/openstack-network-exporter/0.log" Nov 25 15:48:03 crc kubenswrapper[4965]: I1125 15:48:03.407258 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8ea3b8e7-7b5d-46e0-b07d-33db65d5305d/ovsdbserver-sb/0.log" Nov 25 15:48:03 crc kubenswrapper[4965]: I1125 15:48:03.573060 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55bb9cdb94-946lv_58f31396-3b21-4e2e-981e-32196692ab5d/placement-api/0.log" Nov 25 15:48:03 crc kubenswrapper[4965]: I1125 15:48:03.724732 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9a811059-77da-436c-95e6-fddf5baa649c/setup-container/0.log" Nov 25 15:48:03 crc kubenswrapper[4965]: I1125 15:48:03.732825 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55bb9cdb94-946lv_58f31396-3b21-4e2e-981e-32196692ab5d/placement-log/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.136479 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9a811059-77da-436c-95e6-fddf5baa649c/setup-container/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.158938 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9a811059-77da-436c-95e6-fddf5baa649c/rabbitmq/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.185562 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5400ed8-9880-47b3-b8e7-5de35a2c7e00/setup-container/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.462024 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l7vsz_9f4689ce-557b-4194-9502-fe642064225e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.472219 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5400ed8-9880-47b3-b8e7-5de35a2c7e00/rabbitmq/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.515418 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5400ed8-9880-47b3-b8e7-5de35a2c7e00/setup-container/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.862915 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-czlzd_d808d3ad-65f1-4019-a1d1-5d0b9afac8c2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:48:04 crc kubenswrapper[4965]: I1125 15:48:04.918138 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9zx8m_158e33b0-a941-4486-af32-c561ca8d32db/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:48:05 crc kubenswrapper[4965]: I1125 15:48:05.059062 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sd7b2_c6c05a6e-b7d5-4906-96b7-713e13170260/ssh-known-hosts-edpm-deployment/0.log" Nov 25 15:48:05 crc kubenswrapper[4965]: I1125 15:48:05.070667 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3d840d31-e83e-45b7-9863-1e747d7a1290/memcached/0.log" Nov 25 15:48:05 crc kubenswrapper[4965]: I1125 15:48:05.232557 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-fkffw_64f9c1f6-1dd0-4259-962c-637af6b3cf0f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 15:48:27 crc kubenswrapper[4965]: I1125 15:48:27.313010 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/util/0.log" Nov 25 15:48:27 crc kubenswrapper[4965]: I1125 15:48:27.521726 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/util/0.log" Nov 25 15:48:27 crc kubenswrapper[4965]: I1125 15:48:27.972837 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/pull/0.log" Nov 25 15:48:27 crc kubenswrapper[4965]: I1125 15:48:27.972850 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/pull/0.log" Nov 25 15:48:27 crc kubenswrapper[4965]: I1125 15:48:27.972929 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/pull/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 
15:48:28.128995 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/extract/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.202848 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b547be2eac4c3e8f15e2800f340fb44715065f09b75b108844ab9dee6mt4jw_3d3f08fa-5e27-46b5-b0dd-76e9860c0729/util/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.240367 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-7k79p_34390094-977c-4d9b-a9dd-8f4d4a5a89ad/kube-rbac-proxy/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.424378 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-wt2wg_764687bc-d3b6-47b3-96d8-8c31f47ab473/kube-rbac-proxy/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.451244 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-7k79p_34390094-977c-4d9b-a9dd-8f4d4a5a89ad/manager/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.527381 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-wt2wg_764687bc-d3b6-47b3-96d8-8c31f47ab473/manager/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.721084 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-7nxmk_dbc985bf-ffef-456f-b4bd-37faeba9e8a1/manager/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.743234 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-7nxmk_dbc985bf-ffef-456f-b4bd-37faeba9e8a1/kube-rbac-proxy/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.934475 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-kdff2_579b7594-cdbd-4b63-9405-0321a133d2d0/kube-rbac-proxy/0.log" Nov 25 15:48:28 crc kubenswrapper[4965]: I1125 15:48:28.972387 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-kdff2_579b7594-cdbd-4b63-9405-0321a133d2d0/manager/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.106175 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-7fj92_b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4/kube-rbac-proxy/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.162158 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-7fj92_b4a5bfc4-1ea9-4a7b-b1f2-095e3ee468d4/manager/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.225782 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-v96z8_756bbaba-31b3-4cd8-b3a6-6a3e0b805261/kube-rbac-proxy/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.296744 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-v96z8_756bbaba-31b3-4cd8-b3a6-6a3e0b805261/manager/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.407426 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-d5dnx_a3dd58f4-c4d6-43dc-b9fa-78d464337376/kube-rbac-proxy/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 
15:48:29.580059 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-d5dnx_a3dd58f4-c4d6-43dc-b9fa-78d464337376/manager/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.649898 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-f6vxm_7fca11c0-cc43-457e-a797-610c31c9bc7f/kube-rbac-proxy/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.678016 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-f6vxm_7fca11c0-cc43-457e-a797-610c31c9bc7f/manager/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.831338 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-24796_90d4de2d-51f0-4b18-8272-905b733fc714/kube-rbac-proxy/0.log" Nov 25 15:48:29 crc kubenswrapper[4965]: I1125 15:48:29.921333 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-24796_90d4de2d-51f0-4b18-8272-905b733fc714/manager/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.015995 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-77tlb_6493c01f-7b22-4a04-9b25-b17ad7c790a1/kube-rbac-proxy/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.071087 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-77tlb_6493c01f-7b22-4a04-9b25-b17ad7c790a1/manager/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.159269 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-j9tml_af96aeb0-49ef-430d-9780-791c7a1b64da/kube-rbac-proxy/0.log" Nov 
25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.266826 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-j9tml_af96aeb0-49ef-430d-9780-791c7a1b64da/manager/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.368814 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-cf9pl_2b7be07d-fe11-494c-97b3-fa95b997450f/kube-rbac-proxy/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.422023 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-cf9pl_2b7be07d-fe11-494c-97b3-fa95b997450f/manager/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.584493 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-8xprb_f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a/kube-rbac-proxy/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.736855 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-8xprb_f1263806-1d34-4aa8-a5c0-1b8d3db7fb4a/manager/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.852267 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-6hhqt_09d0b7cc-6fc4-40dd-a332-b405d049e756/kube-rbac-proxy/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.907430 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-6hhqt_09d0b7cc-6fc4-40dd-a332-b405d049e756/manager/0.log" Nov 25 15:48:30 crc kubenswrapper[4965]: I1125 15:48:30.990152 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5_d781d742-fdc4-4480-90a3-6330b4add384/kube-rbac-proxy/0.log" Nov 25 15:48:31 crc kubenswrapper[4965]: I1125 15:48:31.036683 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-544b9bb9-9j5p5_d781d742-fdc4-4480-90a3-6330b4add384/manager/0.log" Nov 25 15:48:31 crc kubenswrapper[4965]: I1125 15:48:31.497498 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6d88ccc4fc-wz2ts_716065f2-f9f2-41fd-a193-3e815b38456e/operator/0.log" Nov 25 15:48:31 crc kubenswrapper[4965]: I1125 15:48:31.538283 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xx52h_c4e856d6-2a93-44e9-81c8-c965842a65d9/registry-server/0.log" Nov 25 15:48:31 crc kubenswrapper[4965]: I1125 15:48:31.794371 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-fwc8g_0952a381-bcc9-46de-bdac-bf2bdfe6ecc4/kube-rbac-proxy/0.log" Nov 25 15:48:31 crc kubenswrapper[4965]: I1125 15:48:31.965203 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-68587559f4-7lqhq_d8bdaece-696d-4306-a66b-46c7333eb788/manager/0.log" Nov 25 15:48:31 crc kubenswrapper[4965]: I1125 15:48:31.970242 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-fwc8g_0952a381-bcc9-46de-bdac-bf2bdfe6ecc4/manager/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.037864 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-7fzrj_cd724754-7539-4700-9911-5c0ce503d70f/manager/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.044677 4965 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-7fzrj_cd724754-7539-4700-9911-5c0ce503d70f/kube-rbac-proxy/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.223482 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kckh6_a5374ce4-8ac3-422b-9d62-9412dea697d3/operator/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.277613 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-hg488_3ae7668e-2e54-482f-9340-8ffe413de1d1/kube-rbac-proxy/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.320199 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-hg488_3ae7668e-2e54-482f-9340-8ffe413de1d1/manager/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.424674 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-xxb4h_31a52118-75f6-4e53-a6b6-fd6378c61df8/kube-rbac-proxy/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.550366 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-wmm8k_b85374cc-6464-4dd6-9c38-0cabb8fd7834/kube-rbac-proxy/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.560709 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-xxb4h_31a52118-75f6-4e53-a6b6-fd6378c61df8/manager/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.604588 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-wmm8k_b85374cc-6464-4dd6-9c38-0cabb8fd7834/manager/0.log" Nov 25 15:48:32 crc 
kubenswrapper[4965]: I1125 15:48:32.745288 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-k6bww_cae07c7e-b337-46bb-8b04-06643ee9e6c3/manager/0.log" Nov 25 15:48:32 crc kubenswrapper[4965]: I1125 15:48:32.764358 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-k6bww_cae07c7e-b337-46bb-8b04-06643ee9e6c3/kube-rbac-proxy/0.log" Nov 25 15:48:50 crc kubenswrapper[4965]: I1125 15:48:50.707489 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ksgsk_371d4f01-4337-4da2-8e72-7a79d2a7f98c/control-plane-machine-set-operator/0.log" Nov 25 15:48:50 crc kubenswrapper[4965]: I1125 15:48:50.916438 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b6gl2_317839b7-786d-4e93-8b37-4dd23e4a5032/kube-rbac-proxy/0.log" Nov 25 15:48:50 crc kubenswrapper[4965]: I1125 15:48:50.963526 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b6gl2_317839b7-786d-4e93-8b37-4dd23e4a5032/machine-api-operator/0.log" Nov 25 15:49:03 crc kubenswrapper[4965]: I1125 15:49:03.110126 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-f8xjq_d8569680-d671-4080-b2ce-ce6e9f858342/cert-manager-controller/0.log" Nov 25 15:49:03 crc kubenswrapper[4965]: I1125 15:49:03.315462 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-fwgnb_ae69f1e4-8cbc-45b6-bd5a-ddd31ebde275/cert-manager-cainjector/0.log" Nov 25 15:49:03 crc kubenswrapper[4965]: I1125 15:49:03.370560 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-qrbzh_4be52a9a-2bf2-4d74-8c23-ca6825be424a/cert-manager-webhook/0.log" 
Nov 25 15:49:15 crc kubenswrapper[4965]: I1125 15:49:15.227403 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-hn6qn_04e7005e-527f-4328-8981-7614b841a91b/nmstate-console-plugin/0.log" Nov 25 15:49:15 crc kubenswrapper[4965]: I1125 15:49:15.432989 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z4cdf_afba2902-5375-4150-a501-282d517200e3/nmstate-handler/0.log" Nov 25 15:49:15 crc kubenswrapper[4965]: I1125 15:49:15.465787 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-9c85f_c72314c6-eff3-4dd6-aa24-b6831b35580f/kube-rbac-proxy/0.log" Nov 25 15:49:15 crc kubenswrapper[4965]: I1125 15:49:15.518199 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-9c85f_c72314c6-eff3-4dd6-aa24-b6831b35580f/nmstate-metrics/0.log" Nov 25 15:49:15 crc kubenswrapper[4965]: I1125 15:49:15.674897 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-mdzg9_8bfc4cc3-37e8-4034-89fe-922f3d3fd12d/nmstate-operator/0.log" Nov 25 15:49:15 crc kubenswrapper[4965]: I1125 15:49:15.758668 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-7sxnc_c59704bb-64d1-4282-ad60-648655cf9bf3/nmstate-webhook/0.log" Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.872955 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvxp4"] Nov 25 15:49:19 crc kubenswrapper[4965]: E1125 15:49:19.874891 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7ba254-50df-4aa2-8ca8-578456aa87bb" containerName="container-00" Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.874907 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7ba254-50df-4aa2-8ca8-578456aa87bb" containerName="container-00" Nov 25 
15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.876046 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7ba254-50df-4aa2-8ca8-578456aa87bb" containerName="container-00" Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.881118 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.913694 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvxp4"] Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.986035 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vclgb\" (UniqueName: \"kubernetes.io/projected/709fa9cc-682f-46aa-932e-52e54a00f4f8-kube-api-access-vclgb\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.986146 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-utilities\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:19 crc kubenswrapper[4965]: I1125 15:49:19.986180 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-catalog-content\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.087854 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-utilities\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.087913 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-catalog-content\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.088015 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vclgb\" (UniqueName: \"kubernetes.io/projected/709fa9cc-682f-46aa-932e-52e54a00f4f8-kube-api-access-vclgb\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.088488 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-utilities\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.088540 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-catalog-content\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.107798 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vclgb\" (UniqueName: 
\"kubernetes.io/projected/709fa9cc-682f-46aa-932e-52e54a00f4f8-kube-api-access-vclgb\") pod \"redhat-operators-wvxp4\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") " pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.233307 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxp4" Nov 25 15:49:20 crc kubenswrapper[4965]: I1125 15:49:20.715907 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvxp4"] Nov 25 15:49:20 crc kubenswrapper[4965]: W1125 15:49:20.736946 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709fa9cc_682f_46aa_932e_52e54a00f4f8.slice/crio-c2f8a1c310d1b24b4203efe63c0d743ff44a37ac68c306424d7c9cfb10c8f06b WatchSource:0}: Error finding container c2f8a1c310d1b24b4203efe63c0d743ff44a37ac68c306424d7c9cfb10c8f06b: Status 404 returned error can't find the container with id c2f8a1c310d1b24b4203efe63c0d743ff44a37ac68c306424d7c9cfb10c8f06b Nov 25 15:49:21 crc kubenswrapper[4965]: I1125 15:49:21.701604 4965 generic.go:334] "Generic (PLEG): container finished" podID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerID="92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92" exitCode=0 Nov 25 15:49:21 crc kubenswrapper[4965]: I1125 15:49:21.701649 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerDied","Data":"92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92"} Nov 25 15:49:21 crc kubenswrapper[4965]: I1125 15:49:21.701896 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" 
event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerStarted","Data":"c2f8a1c310d1b24b4203efe63c0d743ff44a37ac68c306424d7c9cfb10c8f06b"} Nov 25 15:49:23 crc kubenswrapper[4965]: I1125 15:49:23.260696 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:49:23 crc kubenswrapper[4965]: I1125 15:49:23.261336 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:49:23 crc kubenswrapper[4965]: I1125 15:49:23.726440 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerStarted","Data":"1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa"} Nov 25 15:49:26 crc kubenswrapper[4965]: I1125 15:49:26.748770 4965 generic.go:334] "Generic (PLEG): container finished" podID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerID="1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa" exitCode=0 Nov 25 15:49:26 crc kubenswrapper[4965]: I1125 15:49:26.749218 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerDied","Data":"1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa"} Nov 25 15:49:28 crc kubenswrapper[4965]: I1125 15:49:28.805105 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" 
event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerStarted","Data":"54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82"}
Nov 25 15:49:30 crc kubenswrapper[4965]: I1125 15:49:30.233728 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvxp4"
Nov 25 15:49:30 crc kubenswrapper[4965]: I1125 15:49:30.234093 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvxp4"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.122747 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-4pfqd_8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73/kube-rbac-proxy/0.log"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.284652 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvxp4" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="registry-server" probeResult="failure" output=<
Nov 25 15:49:31 crc kubenswrapper[4965]: timeout: failed to connect service ":50051" within 1s
Nov 25 15:49:31 crc kubenswrapper[4965]: >
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.318795 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-4pfqd_8fa594fd-579e-4f16-8ebc-3ecb8ec0ab73/controller/0.log"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.474934 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-frr-files/0.log"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.732765 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-frr-files/0.log"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.787565 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-metrics/0.log"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.800895 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-reloader/0.log"
Nov 25 15:49:31 crc kubenswrapper[4965]: I1125 15:49:31.807905 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-reloader/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.030082 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-frr-files/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.108181 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-reloader/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.141476 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-metrics/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.160398 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-metrics/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.405563 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-metrics/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.433118 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-frr-files/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.490099 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/controller/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.508274 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/cp-reloader/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.711882 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/frr-metrics/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.715702 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/kube-rbac-proxy-frr/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.818872 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/kube-rbac-proxy/0.log"
Nov 25 15:49:32 crc kubenswrapper[4965]: I1125 15:49:32.995943 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/reloader/0.log"
Nov 25 15:49:33 crc kubenswrapper[4965]: I1125 15:49:33.268629 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-sbxmr_4ed3f916-aee7-4d42-b704-bf4e22789ce0/frr-k8s-webhook-server/0.log"
Nov 25 15:49:33 crc kubenswrapper[4965]: I1125 15:49:33.504097 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6c8f57bc76-pdgb9_7f37b40d-6b30-4f51-a5d0-4767877a0fa3/manager/0.log"
Nov 25 15:49:33 crc kubenswrapper[4965]: I1125 15:49:33.632019 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vf7m_ae8db73e-12e4-40b7-8d6b-d44b36b79b46/frr/0.log"
Nov 25 15:49:33 crc kubenswrapper[4965]: I1125 15:49:33.832710 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c874d5568-q92pm_a611d5d0-f5a3-4e37-baeb-73104c16018a/webhook-server/0.log"
Nov 25 15:49:34 crc kubenswrapper[4965]: I1125 15:49:34.022625 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j2bqm_41064ec4-3f9f-481d-8b5f-695a592ec58d/kube-rbac-proxy/0.log"
Nov 25 15:49:34 crc kubenswrapper[4965]: I1125 15:49:34.316927 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j2bqm_41064ec4-3f9f-481d-8b5f-695a592ec58d/speaker/0.log"
Nov 25 15:49:40 crc kubenswrapper[4965]: I1125 15:49:40.288635 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvxp4"
Nov 25 15:49:40 crc kubenswrapper[4965]: I1125 15:49:40.313339 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvxp4" podStartSLOduration=14.867710149 podStartE2EDuration="21.313321831s" podCreationTimestamp="2025-11-25 15:49:19 +0000 UTC" firstStartedPulling="2025-11-25 15:49:21.704044379 +0000 UTC m=+2706.671638145" lastFinishedPulling="2025-11-25 15:49:28.149656071 +0000 UTC m=+2713.117249827" observedRunningTime="2025-11-25 15:49:28.832671818 +0000 UTC m=+2713.800265574" watchObservedRunningTime="2025-11-25 15:49:40.313321831 +0000 UTC m=+2725.280915577"
Nov 25 15:49:40 crc kubenswrapper[4965]: I1125 15:49:40.348043 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvxp4"
Nov 25 15:49:40 crc kubenswrapper[4965]: I1125 15:49:40.534248 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvxp4"]
Nov 25 15:49:41 crc kubenswrapper[4965]: I1125 15:49:41.898223 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvxp4" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="registry-server" containerID="cri-o://54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82" gracePeriod=2
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.393299 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxp4"
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.524062 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-catalog-content\") pod \"709fa9cc-682f-46aa-932e-52e54a00f4f8\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") "
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.524144 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-utilities\") pod \"709fa9cc-682f-46aa-932e-52e54a00f4f8\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") "
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.524201 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vclgb\" (UniqueName: \"kubernetes.io/projected/709fa9cc-682f-46aa-932e-52e54a00f4f8-kube-api-access-vclgb\") pod \"709fa9cc-682f-46aa-932e-52e54a00f4f8\" (UID: \"709fa9cc-682f-46aa-932e-52e54a00f4f8\") "
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.525241 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-utilities" (OuterVolumeSpecName: "utilities") pod "709fa9cc-682f-46aa-932e-52e54a00f4f8" (UID: "709fa9cc-682f-46aa-932e-52e54a00f4f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.541127 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709fa9cc-682f-46aa-932e-52e54a00f4f8-kube-api-access-vclgb" (OuterVolumeSpecName: "kube-api-access-vclgb") pod "709fa9cc-682f-46aa-932e-52e54a00f4f8" (UID: "709fa9cc-682f-46aa-932e-52e54a00f4f8"). InnerVolumeSpecName "kube-api-access-vclgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.626796 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.626839 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vclgb\" (UniqueName: \"kubernetes.io/projected/709fa9cc-682f-46aa-932e-52e54a00f4f8-kube-api-access-vclgb\") on node \"crc\" DevicePath \"\""
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.626954 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "709fa9cc-682f-46aa-932e-52e54a00f4f8" (UID: "709fa9cc-682f-46aa-932e-52e54a00f4f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.728537 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709fa9cc-682f-46aa-932e-52e54a00f4f8-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.910959 4965 generic.go:334] "Generic (PLEG): container finished" podID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerID="54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82" exitCode=0
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.911048 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvxp4"
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.911064 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerDied","Data":"54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82"}
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.911450 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvxp4" event={"ID":"709fa9cc-682f-46aa-932e-52e54a00f4f8","Type":"ContainerDied","Data":"c2f8a1c310d1b24b4203efe63c0d743ff44a37ac68c306424d7c9cfb10c8f06b"}
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.911478 4965 scope.go:117] "RemoveContainer" containerID="54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82"
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.940513 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvxp4"]
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.949281 4965 scope.go:117] "RemoveContainer" containerID="1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa"
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.950333 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvxp4"]
Nov 25 15:49:42 crc kubenswrapper[4965]: I1125 15:49:42.997251 4965 scope.go:117] "RemoveContainer" containerID="92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92"
Nov 25 15:49:43 crc kubenswrapper[4965]: I1125 15:49:43.032325 4965 scope.go:117] "RemoveContainer" containerID="54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82"
Nov 25 15:49:43 crc kubenswrapper[4965]: E1125 15:49:43.032826 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82\": container with ID starting with 54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82 not found: ID does not exist" containerID="54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82"
Nov 25 15:49:43 crc kubenswrapper[4965]: I1125 15:49:43.032863 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82"} err="failed to get container status \"54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82\": rpc error: code = NotFound desc = could not find container \"54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82\": container with ID starting with 54d7211705f22d60c9e0d6003a660ea0bf1238e03d239d703a6405f5cc19dd82 not found: ID does not exist"
Nov 25 15:49:43 crc kubenswrapper[4965]: I1125 15:49:43.032885 4965 scope.go:117] "RemoveContainer" containerID="1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa"
Nov 25 15:49:43 crc kubenswrapper[4965]: E1125 15:49:43.033272 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa\": container with ID starting with 1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa not found: ID does not exist" containerID="1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa"
Nov 25 15:49:43 crc kubenswrapper[4965]: I1125 15:49:43.033291 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa"} err="failed to get container status \"1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa\": rpc error: code = NotFound desc = could not find container \"1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa\": container with ID starting with 1538a8b61c54875c6481d03b081a51b046d98ff2fc099d613ff5182b5da116aa not found: ID does not exist"
Nov 25 15:49:43 crc kubenswrapper[4965]: I1125 15:49:43.033303 4965 scope.go:117] "RemoveContainer" containerID="92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92"
Nov 25 15:49:43 crc kubenswrapper[4965]: E1125 15:49:43.033585 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92\": container with ID starting with 92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92 not found: ID does not exist" containerID="92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92"
Nov 25 15:49:43 crc kubenswrapper[4965]: I1125 15:49:43.033610 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92"} err="failed to get container status \"92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92\": rpc error: code = NotFound desc = could not find container \"92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92\": container with ID starting with 92d656c779691e3cc328e5ad9ae5b5b5bac41d941fcf58ecbe7ad781acb49f92 not found: ID does not exist"
Nov 25 15:49:44 crc kubenswrapper[4965]: I1125 15:49:44.781740 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" path="/var/lib/kubelet/pods/709fa9cc-682f-46aa-932e-52e54a00f4f8/volumes"
Nov 25 15:49:47 crc kubenswrapper[4965]: I1125 15:49:47.823705 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/util/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.030412 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/util/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.040244 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/pull/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.077638 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/pull/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.226828 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/extract/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.251956 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/pull/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.311389 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ek4b56_519a4c2a-d39b-4e97-9634-622a4283f5c9/util/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.458205 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/extract-utilities/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.667979 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/extract-content/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.699280 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/extract-content/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.699578 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/extract-utilities/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.891316 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/extract-utilities/0.log"
Nov 25 15:49:48 crc kubenswrapper[4965]: I1125 15:49:48.975503 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/extract-content/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.259451 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/extract-utilities/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.437580 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8587n_fa0430c6-ead3-4363-aea8-068563e1bdfe/registry-server/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.444192 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/extract-utilities/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.514584 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/extract-content/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.524504 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/extract-content/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.747938 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/extract-content/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.799359 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/extract-utilities/0.log"
Nov 25 15:49:49 crc kubenswrapper[4965]: I1125 15:49:49.997879 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm2ks_81c585ba-c0e6-4733-bf34-424790b0fafc/registry-server/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.186120 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/util/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.403758 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/pull/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.428103 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/pull/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.436569 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/util/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.561855 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/pull/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.599883 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/util/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.653714 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6vs65k_464b1b3b-03a0-41e6-842a-446cac908eea/extract/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.819609 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-52qw9_86b81218-a04c-44e9-b4bc-efa18ee58d7e/marketplace-operator/0.log"
Nov 25 15:49:50 crc kubenswrapper[4965]: I1125 15:49:50.934087 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/extract-utilities/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.060511 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/extract-utilities/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.111484 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/extract-content/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.165858 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/extract-content/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.422352 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/extract-content/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.434898 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/extract-utilities/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.636243 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6plmq_b439de70-154c-4dbf-99b2-4cc5e9f03996/registry-server/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.707856 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/extract-utilities/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.897420 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/extract-content/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.902784 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/extract-content/0.log"
Nov 25 15:49:51 crc kubenswrapper[4965]: I1125 15:49:51.948592 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/extract-utilities/0.log"
Nov 25 15:49:52 crc kubenswrapper[4965]: I1125 15:49:52.122994 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/extract-utilities/0.log"
Nov 25 15:49:52 crc kubenswrapper[4965]: I1125 15:49:52.178463 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/extract-content/0.log"
Nov 25 15:49:52 crc kubenswrapper[4965]: I1125 15:49:52.511117 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2c9sk_98e97c3d-45dc-4c17-86f0-06141c4b5b69/registry-server/0.log"
Nov 25 15:49:53 crc kubenswrapper[4965]: I1125 15:49:53.261140 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:49:53 crc kubenswrapper[4965]: I1125 15:49:53.261219 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:50:21 crc kubenswrapper[4965]: E1125 15:50:21.622543 4965 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:33590->38.102.83.176:37963: write tcp 38.102.83.176:33590->38.102.83.176:37963: write: broken pipe
Nov 25 15:50:23 crc kubenswrapper[4965]: I1125 15:50:23.259879 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:50:23 crc kubenswrapper[4965]: I1125 15:50:23.260206 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:50:23 crc kubenswrapper[4965]: I1125 15:50:23.260248 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2"
Nov 25 15:50:23 crc kubenswrapper[4965]: I1125 15:50:23.260975 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 15:50:23 crc kubenswrapper[4965]: I1125 15:50:23.261055 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551" gracePeriod=600
Nov 25 15:50:23 crc kubenswrapper[4965]: E1125 15:50:23.541174 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ab112c4_45b9_468b_aa31_93b4f3c7444d.slice/crio-6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ab112c4_45b9_468b_aa31_93b4f3c7444d.slice/crio-conmon-6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551.scope\": RecentStats: unable to find data in memory cache]"
Nov 25 15:50:24 crc kubenswrapper[4965]: I1125 15:50:24.251332 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551" exitCode=0
Nov 25 15:50:24 crc kubenswrapper[4965]: I1125 15:50:24.251387 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551"}
Nov 25 15:50:24 crc kubenswrapper[4965]: I1125 15:50:24.251754 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerStarted","Data":"401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"}
Nov 25 15:50:24 crc kubenswrapper[4965]: I1125 15:50:24.251777 4965 scope.go:117] "RemoveContainer" containerID="c721d271989b5aeea8981342646f3d47ee033d4bf096a39028a50f0c9963dd24"
Nov 25 15:51:28 crc kubenswrapper[4965]: E1125 15:51:27.999156 4965 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.228s"
Nov 25 15:51:49 crc kubenswrapper[4965]: I1125 15:51:49.166110 4965 generic.go:334] "Generic (PLEG): container finished" podID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerID="4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989" exitCode=0
Nov 25 15:51:49 crc kubenswrapper[4965]: I1125 15:51:49.166212 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5wld/must-gather-rkk4m" event={"ID":"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982","Type":"ContainerDied","Data":"4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989"}
Nov 25 15:51:49 crc kubenswrapper[4965]: I1125 15:51:49.168377 4965 scope.go:117] "RemoveContainer" containerID="4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989"
Nov 25 15:51:49 crc kubenswrapper[4965]: I1125 15:51:49.309306 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f5wld_must-gather-rkk4m_c62aeff2-c0b2-4ab2-aa47-8c427ee6c982/gather/0.log"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.453106 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8rlr"]
Nov 25 15:51:50 crc kubenswrapper[4965]: E1125 15:51:50.453925 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="extract-content"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.453941 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="extract-content"
Nov 25 15:51:50 crc kubenswrapper[4965]: E1125 15:51:50.453998 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="extract-utilities"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.454008 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="extract-utilities"
Nov 25 15:51:50 crc kubenswrapper[4965]: E1125 15:51:50.454022 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="registry-server"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.454030 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="registry-server"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.454362 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="709fa9cc-682f-46aa-932e-52e54a00f4f8" containerName="registry-server"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.455952 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.470979 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8rlr"]
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.597989 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-catalog-content\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.598108 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-utilities\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.598147 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7pvk\" (UniqueName: \"kubernetes.io/projected/f2cccf05-3ea1-400b-be52-d24739ac6dfd-kube-api-access-k7pvk\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.699261 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-catalog-content\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.699321 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-utilities\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.699345 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7pvk\" (UniqueName: \"kubernetes.io/projected/f2cccf05-3ea1-400b-be52-d24739ac6dfd-kube-api-access-k7pvk\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.699774 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-catalog-content\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.699795 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-utilities\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.725589 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7pvk\" (UniqueName: \"kubernetes.io/projected/f2cccf05-3ea1-400b-be52-d24739ac6dfd-kube-api-access-k7pvk\") pod \"certified-operators-r8rlr\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:50 crc kubenswrapper[4965]: I1125 15:51:50.781040 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8rlr"
Nov 25 15:51:51 crc kubenswrapper[4965]: W1125 15:51:51.416175 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2cccf05_3ea1_400b_be52_d24739ac6dfd.slice/crio-394c217ce76974a97cfe94b8f69b0a3192cc9aecd557e8f3f055f80587bc7e5f WatchSource:0}: Error finding container 394c217ce76974a97cfe94b8f69b0a3192cc9aecd557e8f3f055f80587bc7e5f: Status 404 returned error can't find the container with id 394c217ce76974a97cfe94b8f69b0a3192cc9aecd557e8f3f055f80587bc7e5f
Nov 25 15:51:51 crc kubenswrapper[4965]: I1125 15:51:51.417205 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8rlr"]
Nov 25 15:51:52 crc kubenswrapper[4965]: I1125 15:51:52.206684 4965 generic.go:334] "Generic (PLEG): container finished" podID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerID="fcfd9d93316f52f61f80d18d5d26f80881bc8b3f6a7092cfb71b21982ae18226" exitCode=0
Nov 25 15:51:52 crc kubenswrapper[4965]: I1125 15:51:52.207077 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerDied","Data":"fcfd9d93316f52f61f80d18d5d26f80881bc8b3f6a7092cfb71b21982ae18226"}
Nov 25 15:51:52 crc kubenswrapper[4965]: I1125 15:51:52.207117
4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerStarted","Data":"394c217ce76974a97cfe94b8f69b0a3192cc9aecd557e8f3f055f80587bc7e5f"} Nov 25 15:51:52 crc kubenswrapper[4965]: I1125 15:51:52.210110 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:51:53 crc kubenswrapper[4965]: I1125 15:51:53.218609 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerStarted","Data":"04b1fbb70708f6128c1bb3d7a0a12d82d568a19f2f123df9815c960fb5470930"} Nov 25 15:51:54 crc kubenswrapper[4965]: I1125 15:51:54.228419 4965 generic.go:334] "Generic (PLEG): container finished" podID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerID="04b1fbb70708f6128c1bb3d7a0a12d82d568a19f2f123df9815c960fb5470930" exitCode=0 Nov 25 15:51:54 crc kubenswrapper[4965]: I1125 15:51:54.228592 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerDied","Data":"04b1fbb70708f6128c1bb3d7a0a12d82d568a19f2f123df9815c960fb5470930"} Nov 25 15:51:55 crc kubenswrapper[4965]: I1125 15:51:55.237997 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerStarted","Data":"ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535"} Nov 25 15:51:55 crc kubenswrapper[4965]: I1125 15:51:55.255837 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8rlr" podStartSLOduration=2.807882439 podStartE2EDuration="5.255817854s" podCreationTimestamp="2025-11-25 15:51:50 +0000 UTC" firstStartedPulling="2025-11-25 
15:51:52.209817447 +0000 UTC m=+2857.177411193" lastFinishedPulling="2025-11-25 15:51:54.657752852 +0000 UTC m=+2859.625346608" observedRunningTime="2025-11-25 15:51:55.252805412 +0000 UTC m=+2860.220399158" watchObservedRunningTime="2025-11-25 15:51:55.255817854 +0000 UTC m=+2860.223411600" Nov 25 15:51:57 crc kubenswrapper[4965]: I1125 15:51:57.739194 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5wld/must-gather-rkk4m"] Nov 25 15:51:57 crc kubenswrapper[4965]: I1125 15:51:57.739764 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f5wld/must-gather-rkk4m" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="copy" containerID="cri-o://69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d" gracePeriod=2 Nov 25 15:51:57 crc kubenswrapper[4965]: I1125 15:51:57.750900 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5wld/must-gather-rkk4m"] Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.204227 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f5wld_must-gather-rkk4m_c62aeff2-c0b2-4ab2-aa47-8c427ee6c982/copy/0.log" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.204818 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.262904 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f5wld_must-gather-rkk4m_c62aeff2-c0b2-4ab2-aa47-8c427ee6c982/copy/0.log" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.263294 4965 generic.go:334] "Generic (PLEG): container finished" podID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerID="69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d" exitCode=143 Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.263375 4965 scope.go:117] "RemoveContainer" containerID="69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.263385 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5wld/must-gather-rkk4m" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.283485 4965 scope.go:117] "RemoveContainer" containerID="4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.342714 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98l8s\" (UniqueName: \"kubernetes.io/projected/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-kube-api-access-98l8s\") pod \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.342986 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-must-gather-output\") pod \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\" (UID: \"c62aeff2-c0b2-4ab2-aa47-8c427ee6c982\") " Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.351169 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-kube-api-access-98l8s" (OuterVolumeSpecName: "kube-api-access-98l8s") pod "c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" (UID: "c62aeff2-c0b2-4ab2-aa47-8c427ee6c982"). InnerVolumeSpecName "kube-api-access-98l8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.358747 4965 scope.go:117] "RemoveContainer" containerID="69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d" Nov 25 15:51:58 crc kubenswrapper[4965]: E1125 15:51:58.361368 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d\": container with ID starting with 69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d not found: ID does not exist" containerID="69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.361542 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d"} err="failed to get container status \"69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d\": rpc error: code = NotFound desc = could not find container \"69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d\": container with ID starting with 69c843831e752b96738bbea0036a094976f3c88ed353bd131d725c7b0b1ccf4d not found: ID does not exist" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.361630 4965 scope.go:117] "RemoveContainer" containerID="4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989" Nov 25 15:51:58 crc kubenswrapper[4965]: E1125 15:51:58.362136 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989\": 
container with ID starting with 4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989 not found: ID does not exist" containerID="4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.362273 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989"} err="failed to get container status \"4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989\": rpc error: code = NotFound desc = could not find container \"4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989\": container with ID starting with 4dd576bf99284b996c8bbca6b7a3b68cc79926dd669ae500b96525f4a1e28989 not found: ID does not exist" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.445914 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98l8s\" (UniqueName: \"kubernetes.io/projected/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-kube-api-access-98l8s\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.496866 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" (UID: "c62aeff2-c0b2-4ab2-aa47-8c427ee6c982"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.547983 4965 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:58 crc kubenswrapper[4965]: I1125 15:51:58.781030 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" path="/var/lib/kubelet/pods/c62aeff2-c0b2-4ab2-aa47-8c427ee6c982/volumes" Nov 25 15:52:00 crc kubenswrapper[4965]: I1125 15:52:00.781928 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8rlr" Nov 25 15:52:00 crc kubenswrapper[4965]: I1125 15:52:00.782699 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8rlr" Nov 25 15:52:00 crc kubenswrapper[4965]: I1125 15:52:00.827316 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8rlr" Nov 25 15:52:01 crc kubenswrapper[4965]: I1125 15:52:01.356213 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8rlr" Nov 25 15:52:01 crc kubenswrapper[4965]: I1125 15:52:01.400743 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8rlr"] Nov 25 15:52:03 crc kubenswrapper[4965]: I1125 15:52:03.320089 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8rlr" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="registry-server" containerID="cri-o://ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535" gracePeriod=2 Nov 25 15:52:10 crc kubenswrapper[4965]: E1125 15:52:10.783002 4965 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535 is running failed: container process not found" containerID="ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:52:10 crc kubenswrapper[4965]: E1125 15:52:10.784399 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535 is running failed: container process not found" containerID="ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:52:10 crc kubenswrapper[4965]: E1125 15:52:10.784818 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535 is running failed: container process not found" containerID="ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:52:10 crc kubenswrapper[4965]: E1125 15:52:10.784899 4965 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-r8rlr" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="registry-server" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.233202 4965 generic.go:334] "Generic (PLEG): container finished" podID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerID="ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535" exitCode=0 Nov 25 15:52:11 crc 
kubenswrapper[4965]: I1125 15:52:11.233730 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerDied","Data":"ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535"} Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.313231 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8rlr" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.401668 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7pvk\" (UniqueName: \"kubernetes.io/projected/f2cccf05-3ea1-400b-be52-d24739ac6dfd-kube-api-access-k7pvk\") pod \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.401726 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-utilities\") pod \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.401759 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-catalog-content\") pod \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\" (UID: \"f2cccf05-3ea1-400b-be52-d24739ac6dfd\") " Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.403451 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-utilities" (OuterVolumeSpecName: "utilities") pod "f2cccf05-3ea1-400b-be52-d24739ac6dfd" (UID: "f2cccf05-3ea1-400b-be52-d24739ac6dfd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.407274 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cccf05-3ea1-400b-be52-d24739ac6dfd-kube-api-access-k7pvk" (OuterVolumeSpecName: "kube-api-access-k7pvk") pod "f2cccf05-3ea1-400b-be52-d24739ac6dfd" (UID: "f2cccf05-3ea1-400b-be52-d24739ac6dfd"). InnerVolumeSpecName "kube-api-access-k7pvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.463822 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2cccf05-3ea1-400b-be52-d24739ac6dfd" (UID: "f2cccf05-3ea1-400b-be52-d24739ac6dfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.504375 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7pvk\" (UniqueName: \"kubernetes.io/projected/f2cccf05-3ea1-400b-be52-d24739ac6dfd-kube-api-access-k7pvk\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.504423 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:11 crc kubenswrapper[4965]: I1125 15:52:11.504440 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2cccf05-3ea1-400b-be52-d24739ac6dfd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.242942 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8rlr" 
event={"ID":"f2cccf05-3ea1-400b-be52-d24739ac6dfd","Type":"ContainerDied","Data":"394c217ce76974a97cfe94b8f69b0a3192cc9aecd557e8f3f055f80587bc7e5f"} Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.243018 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8rlr" Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.243395 4965 scope.go:117] "RemoveContainer" containerID="ccad28ea673bf824649f743c2428f2002f279ae9577641deb0801f5ffa587535" Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.264886 4965 scope.go:117] "RemoveContainer" containerID="04b1fbb70708f6128c1bb3d7a0a12d82d568a19f2f123df9815c960fb5470930" Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.276732 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8rlr"] Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.288032 4965 scope.go:117] "RemoveContainer" containerID="fcfd9d93316f52f61f80d18d5d26f80881bc8b3f6a7092cfb71b21982ae18226" Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.299194 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8rlr"] Nov 25 15:52:12 crc kubenswrapper[4965]: I1125 15:52:12.798758 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" path="/var/lib/kubelet/pods/f2cccf05-3ea1-400b-be52-d24739ac6dfd/volumes" Nov 25 15:52:23 crc kubenswrapper[4965]: I1125 15:52:23.260437 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:52:23 crc kubenswrapper[4965]: I1125 15:52:23.260950 4965 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:52:53 crc kubenswrapper[4965]: I1125 15:52:53.260859 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:52:53 crc kubenswrapper[4965]: I1125 15:52:53.261382 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.990311 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5vb2j"] Nov 25 15:53:11 crc kubenswrapper[4965]: E1125 15:53:11.991251 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="extract-utilities" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991340 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="extract-utilities" Nov 25 15:53:11 crc kubenswrapper[4965]: E1125 15:53:11.991358 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="copy" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991366 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="copy" Nov 25 15:53:11 crc 
kubenswrapper[4965]: E1125 15:53:11.991390 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="registry-server" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991399 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="registry-server" Nov 25 15:53:11 crc kubenswrapper[4965]: E1125 15:53:11.991408 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="extract-content" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991415 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="extract-content" Nov 25 15:53:11 crc kubenswrapper[4965]: E1125 15:53:11.991428 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="gather" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991435 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="gather" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991652 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cccf05-3ea1-400b-be52-d24739ac6dfd" containerName="registry-server" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991670 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="gather" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.991690 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62aeff2-c0b2-4ab2-aa47-8c427ee6c982" containerName="copy" Nov 25 15:53:11 crc kubenswrapper[4965]: I1125 15:53:11.993015 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.003769 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vb2j"] Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.092881 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-utilities\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.092932 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqd2n\" (UniqueName: \"kubernetes.io/projected/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-kube-api-access-rqd2n\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.093004 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-catalog-content\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.194248 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-utilities\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.194313 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rqd2n\" (UniqueName: \"kubernetes.io/projected/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-kube-api-access-rqd2n\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.194369 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-catalog-content\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.195040 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-utilities\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.195040 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-catalog-content\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.215716 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqd2n\" (UniqueName: \"kubernetes.io/projected/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-kube-api-access-rqd2n\") pod \"redhat-marketplace-5vb2j\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.325063 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:12 crc kubenswrapper[4965]: W1125 15:53:12.782678 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68574c2b_4a6d_47a4_bd61_1badfb2a8bfe.slice/crio-2da48244d2dcde5d9d1efc622909248ef4a62363e0924f3888bc1c9bd43b6336 WatchSource:0}: Error finding container 2da48244d2dcde5d9d1efc622909248ef4a62363e0924f3888bc1c9bd43b6336: Status 404 returned error can't find the container with id 2da48244d2dcde5d9d1efc622909248ef4a62363e0924f3888bc1c9bd43b6336 Nov 25 15:53:12 crc kubenswrapper[4965]: I1125 15:53:12.783209 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vb2j"] Nov 25 15:53:13 crc kubenswrapper[4965]: I1125 15:53:13.756745 4965 generic.go:334] "Generic (PLEG): container finished" podID="68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" containerID="53825843a69575c5573e854d56362b2631c15a898cc15b01e025bcd01d6caa55" exitCode=0 Nov 25 15:53:13 crc kubenswrapper[4965]: I1125 15:53:13.756936 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerDied","Data":"53825843a69575c5573e854d56362b2631c15a898cc15b01e025bcd01d6caa55"} Nov 25 15:53:13 crc kubenswrapper[4965]: I1125 15:53:13.757087 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerStarted","Data":"2da48244d2dcde5d9d1efc622909248ef4a62363e0924f3888bc1c9bd43b6336"} Nov 25 15:53:14 crc kubenswrapper[4965]: I1125 15:53:14.767252 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" 
event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerStarted","Data":"d95ab2430566159490e32897029c25da58fe21f1aa2c903080e844c0019498a0"} Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.600303 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r55wb"] Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.603876 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.639298 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r55wb"] Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.757603 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-catalog-content\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.757732 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-utilities\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.757804 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmctp\" (UniqueName: \"kubernetes.io/projected/895a5803-5bf4-4e47-9f2b-de911bb36acd-kube-api-access-jmctp\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.776564 
4965 generic.go:334] "Generic (PLEG): container finished" podID="68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" containerID="d95ab2430566159490e32897029c25da58fe21f1aa2c903080e844c0019498a0" exitCode=0 Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.776600 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerDied","Data":"d95ab2430566159490e32897029c25da58fe21f1aa2c903080e844c0019498a0"} Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.859902 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-utilities\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.860074 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmctp\" (UniqueName: \"kubernetes.io/projected/895a5803-5bf4-4e47-9f2b-de911bb36acd-kube-api-access-jmctp\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.860251 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-catalog-content\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.860433 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-utilities\") pod \"community-operators-r55wb\" (UID: 
\"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.861042 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-catalog-content\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.889587 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmctp\" (UniqueName: \"kubernetes.io/projected/895a5803-5bf4-4e47-9f2b-de911bb36acd-kube-api-access-jmctp\") pod \"community-operators-r55wb\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:15 crc kubenswrapper[4965]: I1125 15:53:15.926310 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:16 crc kubenswrapper[4965]: I1125 15:53:16.473831 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r55wb"] Nov 25 15:53:16 crc kubenswrapper[4965]: I1125 15:53:16.790248 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerStarted","Data":"b0dd24c4c93c128ba2018bab750a5c9355d6eef37ba4900e63ab7fdaad79485d"} Nov 25 15:53:17 crc kubenswrapper[4965]: I1125 15:53:17.802823 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerStarted","Data":"9bfdceb128c023e2a595a2d413cc841387157f082ca2e5b1c2c8f7b178a39e68"} Nov 25 15:53:17 crc kubenswrapper[4965]: I1125 15:53:17.804373 4965 generic.go:334] "Generic (PLEG): container finished" podID="895a5803-5bf4-4e47-9f2b-de911bb36acd" containerID="eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0" exitCode=0 Nov 25 15:53:17 crc kubenswrapper[4965]: I1125 15:53:17.804414 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerDied","Data":"eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0"} Nov 25 15:53:17 crc kubenswrapper[4965]: I1125 15:53:17.832556 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5vb2j" podStartSLOduration=4.130514208 podStartE2EDuration="6.832534682s" podCreationTimestamp="2025-11-25 15:53:11 +0000 UTC" firstStartedPulling="2025-11-25 15:53:13.762565068 +0000 UTC m=+2938.730158814" lastFinishedPulling="2025-11-25 15:53:16.464585542 +0000 UTC m=+2941.432179288" observedRunningTime="2025-11-25 15:53:17.825095939 
+0000 UTC m=+2942.792689685" watchObservedRunningTime="2025-11-25 15:53:17.832534682 +0000 UTC m=+2942.800128428" Nov 25 15:53:19 crc kubenswrapper[4965]: I1125 15:53:19.821513 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerStarted","Data":"4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1"} Nov 25 15:53:22 crc kubenswrapper[4965]: I1125 15:53:22.326188 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:22 crc kubenswrapper[4965]: I1125 15:53:22.326532 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:22 crc kubenswrapper[4965]: I1125 15:53:22.410213 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:22 crc kubenswrapper[4965]: I1125 15:53:22.904073 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.260617 4965 patch_prober.go:28] interesting pod/machine-config-daemon-x42s2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.260674 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 
15:53:23.260720 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.261485 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"} pod="openshift-machine-config-operator/machine-config-daemon-x42s2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.261540 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerName="machine-config-daemon" containerID="cri-o://401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee" gracePeriod=600 Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.385709 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vb2j"] Nov 25 15:53:23 crc kubenswrapper[4965]: E1125 15:53:23.392244 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.860331 4965 generic.go:334] "Generic (PLEG): container finished" podID="895a5803-5bf4-4e47-9f2b-de911bb36acd" containerID="4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1" exitCode=0 Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.860657 4965 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerDied","Data":"4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1"} Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.864895 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee" exitCode=0 Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.865319 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" event={"ID":"7ab112c4-45b9-468b-aa31-93b4f3c7444d","Type":"ContainerDied","Data":"401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"} Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.865383 4965 scope.go:117] "RemoveContainer" containerID="6a1dca3b27555c377546c9201f462ced891b48f7a7c015e491a01ed360c56551" Nov 25 15:53:23 crc kubenswrapper[4965]: I1125 15:53:23.866247 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee" Nov 25 15:53:23 crc kubenswrapper[4965]: E1125 15:53:23.866486 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:53:24 crc kubenswrapper[4965]: I1125 15:53:24.876591 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5vb2j" podUID="68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" containerName="registry-server" 
containerID="cri-o://9bfdceb128c023e2a595a2d413cc841387157f082ca2e5b1c2c8f7b178a39e68" gracePeriod=2 Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.886616 4965 generic.go:334] "Generic (PLEG): container finished" podID="68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" containerID="9bfdceb128c023e2a595a2d413cc841387157f082ca2e5b1c2c8f7b178a39e68" exitCode=0 Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.886938 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerDied","Data":"9bfdceb128c023e2a595a2d413cc841387157f082ca2e5b1c2c8f7b178a39e68"} Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.886994 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vb2j" event={"ID":"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe","Type":"ContainerDied","Data":"2da48244d2dcde5d9d1efc622909248ef4a62363e0924f3888bc1c9bd43b6336"} Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.887006 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da48244d2dcde5d9d1efc622909248ef4a62363e0924f3888bc1c9bd43b6336" Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.891195 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerStarted","Data":"1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09"} Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.905735 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.926597 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.926649 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:25 crc kubenswrapper[4965]: I1125 15:53:25.930624 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r55wb" podStartSLOduration=4.039015157 podStartE2EDuration="10.930499014s" podCreationTimestamp="2025-11-25 15:53:15 +0000 UTC" firstStartedPulling="2025-11-25 15:53:17.806590265 +0000 UTC m=+2942.774184011" lastFinishedPulling="2025-11-25 15:53:24.698074112 +0000 UTC m=+2949.665667868" observedRunningTime="2025-11-25 15:53:25.911172297 +0000 UTC m=+2950.878766043" watchObservedRunningTime="2025-11-25 15:53:25.930499014 +0000 UTC m=+2950.898092770" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.067949 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqd2n\" (UniqueName: \"kubernetes.io/projected/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-kube-api-access-rqd2n\") pod \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.068165 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-catalog-content\") pod \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.068277 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-utilities\") pod \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\" (UID: \"68574c2b-4a6d-47a4-bd61-1badfb2a8bfe\") " Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.069724 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-utilities" (OuterVolumeSpecName: "utilities") pod "68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" (UID: "68574c2b-4a6d-47a4-bd61-1badfb2a8bfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.073753 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-kube-api-access-rqd2n" (OuterVolumeSpecName: "kube-api-access-rqd2n") pod "68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" (UID: "68574c2b-4a6d-47a4-bd61-1badfb2a8bfe"). InnerVolumeSpecName "kube-api-access-rqd2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.087029 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" (UID: "68574c2b-4a6d-47a4-bd61-1badfb2a8bfe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.178420 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.178459 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqd2n\" (UniqueName: \"kubernetes.io/projected/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-kube-api-access-rqd2n\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.178470 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.910636 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vb2j" Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.938666 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vb2j"] Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.949849 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vb2j"] Nov 25 15:53:26 crc kubenswrapper[4965]: I1125 15:53:26.983254 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r55wb" podUID="895a5803-5bf4-4e47-9f2b-de911bb36acd" containerName="registry-server" probeResult="failure" output=< Nov 25 15:53:26 crc kubenswrapper[4965]: timeout: failed to connect service ":50051" within 1s Nov 25 15:53:26 crc kubenswrapper[4965]: > Nov 25 15:53:28 crc kubenswrapper[4965]: I1125 15:53:28.787766 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="68574c2b-4a6d-47a4-bd61-1badfb2a8bfe" path="/var/lib/kubelet/pods/68574c2b-4a6d-47a4-bd61-1badfb2a8bfe/volumes" Nov 25 15:53:35 crc kubenswrapper[4965]: I1125 15:53:35.771314 4965 scope.go:117] "RemoveContainer" containerID="7322451c391900e4e01cef874957ca23d58761d0936c424754cc70706bd8f3ed" Nov 25 15:53:35 crc kubenswrapper[4965]: I1125 15:53:35.983631 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:36 crc kubenswrapper[4965]: I1125 15:53:36.041638 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:36 crc kubenswrapper[4965]: I1125 15:53:36.226066 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r55wb"] Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.088675 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r55wb" podUID="895a5803-5bf4-4e47-9f2b-de911bb36acd" containerName="registry-server" containerID="cri-o://1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09" gracePeriod=2 Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.604189 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.768460 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmctp\" (UniqueName: \"kubernetes.io/projected/895a5803-5bf4-4e47-9f2b-de911bb36acd-kube-api-access-jmctp\") pod \"895a5803-5bf4-4e47-9f2b-de911bb36acd\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.768542 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-catalog-content\") pod \"895a5803-5bf4-4e47-9f2b-de911bb36acd\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.768765 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-utilities\") pod \"895a5803-5bf4-4e47-9f2b-de911bb36acd\" (UID: \"895a5803-5bf4-4e47-9f2b-de911bb36acd\") " Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.769571 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-utilities" (OuterVolumeSpecName: "utilities") pod "895a5803-5bf4-4e47-9f2b-de911bb36acd" (UID: "895a5803-5bf4-4e47-9f2b-de911bb36acd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.774230 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895a5803-5bf4-4e47-9f2b-de911bb36acd-kube-api-access-jmctp" (OuterVolumeSpecName: "kube-api-access-jmctp") pod "895a5803-5bf4-4e47-9f2b-de911bb36acd" (UID: "895a5803-5bf4-4e47-9f2b-de911bb36acd"). InnerVolumeSpecName "kube-api-access-jmctp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.835198 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "895a5803-5bf4-4e47-9f2b-de911bb36acd" (UID: "895a5803-5bf4-4e47-9f2b-de911bb36acd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.870907 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.870955 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmctp\" (UniqueName: \"kubernetes.io/projected/895a5803-5bf4-4e47-9f2b-de911bb36acd-kube-api-access-jmctp\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:37 crc kubenswrapper[4965]: I1125 15:53:37.870985 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895a5803-5bf4-4e47-9f2b-de911bb36acd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.098561 4965 generic.go:334] "Generic (PLEG): container finished" podID="895a5803-5bf4-4e47-9f2b-de911bb36acd" containerID="1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09" exitCode=0 Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.098912 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerDied","Data":"1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09"} Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.098941 4965 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-r55wb" event={"ID":"895a5803-5bf4-4e47-9f2b-de911bb36acd","Type":"ContainerDied","Data":"b0dd24c4c93c128ba2018bab750a5c9355d6eef37ba4900e63ab7fdaad79485d"} Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.098957 4965 scope.go:117] "RemoveContainer" containerID="1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.099116 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r55wb" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.117523 4965 scope.go:117] "RemoveContainer" containerID="4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.141694 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r55wb"] Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.152272 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r55wb"] Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.156962 4965 scope.go:117] "RemoveContainer" containerID="eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.188717 4965 scope.go:117] "RemoveContainer" containerID="1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09" Nov 25 15:53:38 crc kubenswrapper[4965]: E1125 15:53:38.189206 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09\": container with ID starting with 1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09 not found: ID does not exist" containerID="1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 
15:53:38.189271 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09"} err="failed to get container status \"1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09\": rpc error: code = NotFound desc = could not find container \"1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09\": container with ID starting with 1f5dd0d3afad9aaebcc1d582d8688251418d0364cc564d4be1ecb8a42493df09 not found: ID does not exist" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.189310 4965 scope.go:117] "RemoveContainer" containerID="4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1" Nov 25 15:53:38 crc kubenswrapper[4965]: E1125 15:53:38.189926 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1\": container with ID starting with 4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1 not found: ID does not exist" containerID="4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.190066 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1"} err="failed to get container status \"4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1\": rpc error: code = NotFound desc = could not find container \"4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1\": container with ID starting with 4a379c6ad733ed713333fcc5d9a503ffec3e9f473db942269c7c97a779ad35b1 not found: ID does not exist" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.190096 4965 scope.go:117] "RemoveContainer" containerID="eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0" Nov 25 15:53:38 crc 
kubenswrapper[4965]: E1125 15:53:38.190723 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0\": container with ID starting with eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0 not found: ID does not exist" containerID="eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.190757 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0"} err="failed to get container status \"eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0\": rpc error: code = NotFound desc = could not find container \"eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0\": container with ID starting with eda2704046937821a271c8c535843ce6f1833c1adc0ee55ba1df2a5c0c4454d0 not found: ID does not exist" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.774472 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee" Nov 25 15:53:38 crc kubenswrapper[4965]: E1125 15:53:38.774677 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d" Nov 25 15:53:38 crc kubenswrapper[4965]: I1125 15:53:38.783609 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895a5803-5bf4-4e47-9f2b-de911bb36acd" path="/var/lib/kubelet/pods/895a5803-5bf4-4e47-9f2b-de911bb36acd/volumes" Nov 25 15:53:53 crc 
kubenswrapper[4965]: I1125 15:53:53.772064 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:53:53 crc kubenswrapper[4965]: E1125 15:53:53.773413 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:54:05 crc kubenswrapper[4965]: I1125 15:54:05.771723 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:54:05 crc kubenswrapper[4965]: E1125 15:54:05.772530 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:54:16 crc kubenswrapper[4965]: I1125 15:54:16.771278 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:54:16 crc kubenswrapper[4965]: E1125 15:54:16.772163 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:54:27 crc kubenswrapper[4965]: I1125 15:54:27.771321 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:54:27 crc kubenswrapper[4965]: E1125 15:54:27.772008 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:54:42 crc kubenswrapper[4965]: I1125 15:54:42.774089 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:54:42 crc kubenswrapper[4965]: E1125 15:54:42.775052 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:54:54 crc kubenswrapper[4965]: I1125 15:54:54.772228 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:54:54 crc kubenswrapper[4965]: E1125 15:54:54.772950 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:55:06 crc kubenswrapper[4965]: I1125 15:55:06.781134 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:55:06 crc kubenswrapper[4965]: E1125 15:55:06.782399 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:55:21 crc kubenswrapper[4965]: I1125 15:55:21.772051 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:55:21 crc kubenswrapper[4965]: E1125 15:55:21.773766 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:55:32 crc kubenswrapper[4965]: I1125 15:55:32.772716 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:55:32 crc kubenswrapper[4965]: E1125 15:55:32.774235 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"
Nov 25 15:55:43 crc kubenswrapper[4965]: I1125 15:55:43.772167 4965 scope.go:117] "RemoveContainer" containerID="401484b9bfc933b94a4a9b38ffb78951be608230cb54002e396275897a88b6ee"
Nov 25 15:55:43 crc kubenswrapper[4965]: E1125 15:55:43.773036 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x42s2_openshift-machine-config-operator(7ab112c4-45b9-468b-aa31-93b4f3c7444d)\"" pod="openshift-machine-config-operator/machine-config-daemon-x42s2" podUID="7ab112c4-45b9-468b-aa31-93b4f3c7444d"